2026-04-20 00:00:07.873869 | Job console starting
2026-04-20 00:00:07.898058 | Updating git repos
2026-04-20 00:00:08.284431 | Cloning repos into workspace
2026-04-20 00:00:08.588703 | Restoring repo states
2026-04-20 00:00:08.621794 | Merging changes
2026-04-20 00:00:08.621816 | Checking out repos
2026-04-20 00:00:09.022199 | Preparing playbooks
2026-04-20 00:00:09.938475 | Running Ansible setup
2026-04-20 00:00:17.528254 | PRE-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/pre.yaml@main]
2026-04-20 00:00:19.639089 |
2026-04-20 00:00:19.639205 | PLAY [Base pre]
2026-04-20 00:00:19.675980 |
2026-04-20 00:00:19.676091 | TASK [Setup log path fact]
2026-04-20 00:00:19.715424 | orchestrator | ok
2026-04-20 00:00:19.735774 |
2026-04-20 00:00:19.735890 | TASK [set-zuul-log-path-fact : Set log path for a build]
2026-04-20 00:00:19.783875 | orchestrator | ok
2026-04-20 00:00:19.798816 |
2026-04-20 00:00:19.812917 | TASK [emit-job-header : Print job information]
2026-04-20 00:00:19.850699 | # Job Information
2026-04-20 00:00:19.850936 | Ansible Version: 2.16.14
2026-04-20 00:00:19.850972 | Job: testbed-deploy-stable-in-a-nutshell-with-tempest-ubuntu-24.04
2026-04-20 00:00:19.851002 | Pipeline: periodic-midnight
2026-04-20 00:00:19.851022 | Executor: 521e9411259a
2026-04-20 00:00:19.851039 | Triggered by: https://github.com/osism/testbed
2026-04-20 00:00:19.851057 | Event ID: c9183a6c0842410e96ab26ea3bab16db
2026-04-20 00:00:19.862009 |
2026-04-20 00:00:19.862100 | LOOP [emit-job-header : Print node information]
2026-04-20 00:00:19.986678 | orchestrator | ok:
2026-04-20 00:00:19.986868 | orchestrator | # Node Information
2026-04-20 00:00:19.986936 | orchestrator | Inventory Hostname: orchestrator
2026-04-20 00:00:19.986964 | orchestrator | Hostname: zuul-static-regiocloud-infra-1
2026-04-20 00:00:19.986987 | orchestrator | Username: zuul-testbed03
2026-04-20 00:00:19.987009 | orchestrator | Distro: Debian 12.13
2026-04-20 00:00:19.987032 | orchestrator | Provider: static-testbed
2026-04-20 00:00:19.987053 | orchestrator | Region:
2026-04-20 00:00:19.987074 | orchestrator | Label: testbed-orchestrator
2026-04-20 00:00:19.987094 | orchestrator | Product Name: OpenStack Nova
2026-04-20 00:00:19.987113 | orchestrator | Interface IP: 81.163.193.140
2026-04-20 00:00:20.006898 |
2026-04-20 00:00:20.007007 | TASK [log-inventory : Ensure Zuul Ansible directory exists]
2026-04-20 00:00:20.953919 | orchestrator -> localhost | changed
2026-04-20 00:00:20.961860 |
2026-04-20 00:00:20.961966 | TASK [log-inventory : Copy ansible inventory to logs dir]
2026-04-20 00:00:23.307237 | orchestrator -> localhost | changed
2026-04-20 00:00:23.320039 |
2026-04-20 00:00:23.320166 | TASK [add-build-sshkey : Check to see if ssh key was already created for this build]
2026-04-20 00:00:23.851933 | orchestrator -> localhost | ok
2026-04-20 00:00:23.857885 |
2026-04-20 00:00:23.857984 | TASK [add-build-sshkey : Create a new key in workspace based on build UUID]
2026-04-20 00:00:23.895241 | orchestrator | ok
2026-04-20 00:00:23.921204 | orchestrator | included: /var/lib/zuul/builds/bf20320bb57b40e38431504705879859/trusted/project_1/github.com/osism/openinfra-zuul-jobs/roles/add-build-sshkey/tasks/create-key-and-replace.yaml
2026-04-20 00:00:23.931227 |
2026-04-20 00:00:23.931313 | TASK [add-build-sshkey : Create Temp SSH key]
2026-04-20 00:00:27.034898 | orchestrator -> localhost | Generating public/private rsa key pair.
2026-04-20 00:00:27.035090 | orchestrator -> localhost | Your identification has been saved in /var/lib/zuul/builds/bf20320bb57b40e38431504705879859/work/bf20320bb57b40e38431504705879859_id_rsa
2026-04-20 00:00:27.035122 | orchestrator -> localhost | Your public key has been saved in /var/lib/zuul/builds/bf20320bb57b40e38431504705879859/work/bf20320bb57b40e38431504705879859_id_rsa.pub
2026-04-20 00:00:27.035143 | orchestrator -> localhost | The key fingerprint is:
2026-04-20 00:00:27.035166 | orchestrator -> localhost | SHA256:BAehzzlJdPUKRGDhyEIyGUg41eVcwm+lBurE7S2y1ec zuul-build-sshkey
2026-04-20 00:00:27.035184 | orchestrator -> localhost | The key's randomart image is:
2026-04-20 00:00:27.035209 | orchestrator -> localhost | +---[RSA 3072]----+
2026-04-20 00:00:27.035228 | orchestrator -> localhost | |B=o. o@**.. |
2026-04-20 00:00:27.035246 | orchestrator -> localhost | |=+ ..OoB .. |
2026-04-20 00:00:27.035262 | orchestrator -> localhost | | ...+o=ooo . |
2026-04-20 00:00:27.035278 | orchestrator -> localhost | | .++.+=. . |
2026-04-20 00:00:27.035294 | orchestrator -> localhost | | o .*=S . |
2026-04-20 00:00:27.035313 | orchestrator -> localhost | | o +.o . |
2026-04-20 00:00:27.035331 | orchestrator -> localhost | | + . o |
2026-04-20 00:00:27.035348 | orchestrator -> localhost | | . E |
2026-04-20 00:00:27.035365 | orchestrator -> localhost | | |
2026-04-20 00:00:27.035382 | orchestrator -> localhost | +----[SHA256]-----+
2026-04-20 00:00:27.035425 | orchestrator -> localhost | ok: Runtime: 0:00:02.097762
2026-04-20 00:00:27.043927 |
2026-04-20 00:00:27.044017 | TASK [add-build-sshkey : Remote setup ssh keys (linux)]
2026-04-20 00:00:27.101647 | orchestrator | ok
2026-04-20 00:00:27.114642 | orchestrator | included: /var/lib/zuul/builds/bf20320bb57b40e38431504705879859/trusted/project_1/github.com/osism/openinfra-zuul-jobs/roles/add-build-sshkey/tasks/remote-linux.yaml
2026-04-20 00:00:27.131206 |
2026-04-20 00:00:27.131300 | TASK [add-build-sshkey : Remove previously added zuul-build-sshkey]
2026-04-20 00:00:27.167097 | orchestrator | skipping: Conditional result was False
2026-04-20 00:00:27.174470 |
2026-04-20 00:00:27.174567 | TASK [add-build-sshkey : Enable access via build key on all nodes]
2026-04-20 00:00:28.010677 | orchestrator | changed
2026-04-20 00:00:28.026202 |
2026-04-20 00:00:28.026291 | TASK [add-build-sshkey : Make sure user has a .ssh]
2026-04-20 00:00:28.324830 | orchestrator | ok
2026-04-20 00:00:28.329826 |
2026-04-20 00:00:28.329903 | TASK [add-build-sshkey : Install build private key as SSH key on all nodes]
2026-04-20 00:00:28.840664 | orchestrator | ok
2026-04-20 00:00:28.849499 |
2026-04-20 00:00:28.849626 | TASK [add-build-sshkey : Install build public key as SSH key on all nodes]
2026-04-20 00:00:29.329197 | orchestrator | ok
2026-04-20 00:00:29.334128 |
2026-04-20 00:00:29.334197 | TASK [add-build-sshkey : Remote setup ssh keys (windows)]
2026-04-20 00:00:29.391438 | orchestrator | skipping: Conditional result was False
2026-04-20 00:00:29.397912 |
2026-04-20 00:00:29.397991 | TASK [remove-zuul-sshkey : Remove master key from local agent]
2026-04-20 00:00:30.745853 | orchestrator -> localhost | changed
2026-04-20 00:00:30.761728 |
2026-04-20 00:00:30.761825 | TASK [add-build-sshkey : Add back temp key]
2026-04-20 00:00:31.393854 | orchestrator -> localhost | Identity added: /var/lib/zuul/builds/bf20320bb57b40e38431504705879859/work/bf20320bb57b40e38431504705879859_id_rsa (zuul-build-sshkey)
2026-04-20 00:00:31.394068 | orchestrator -> localhost | ok: Runtime: 0:00:00.018988
2026-04-20 00:00:31.400471 |
2026-04-20 00:00:31.400578 | TASK [add-build-sshkey : Verify we can still SSH to all nodes]
2026-04-20 00:00:31.851570 | orchestrator | ok
2026-04-20 00:00:31.856430 |
2026-04-20 00:00:31.856544 | TASK [add-build-sshkey : Verify we can still SSH to all nodes (windows)]
2026-04-20 00:00:31.890416 | orchestrator | skipping: Conditional result was False
2026-04-20 00:00:31.964832 |
2026-04-20 00:00:31.964926 | TASK [start-zuul-console : Start zuul_console daemon.]
2026-04-20 00:00:32.497241 | orchestrator | ok
2026-04-20 00:00:32.514517 |
2026-04-20 00:00:32.514621 | TASK [validate-host : Define zuul_info_dir fact]
2026-04-20 00:00:32.557270 | orchestrator | ok
2026-04-20 00:00:32.578227 |
2026-04-20 00:00:32.579133 | TASK [validate-host : Ensure Zuul Ansible directory exists]
2026-04-20 00:00:33.411646 | orchestrator -> localhost | ok
2026-04-20 00:00:33.418796 |
2026-04-20 00:00:33.418919 | TASK [validate-host : Collect information about the host]
2026-04-20 00:00:35.059333 | orchestrator | ok
2026-04-20 00:00:35.111156 |
2026-04-20 00:00:35.111270 | TASK [validate-host : Sanitize hostname]
2026-04-20 00:00:35.228850 | orchestrator | ok
2026-04-20 00:00:35.233697 |
2026-04-20 00:00:35.233799 | TASK [validate-host : Write out all ansible variables/facts known for each host]
2026-04-20 00:00:36.479296 | orchestrator -> localhost | changed
2026-04-20 00:00:36.484377 |
2026-04-20 00:00:36.484462 | TASK [validate-host : Collect information about zuul worker]
2026-04-20 00:00:37.106325 | orchestrator | ok
2026-04-20 00:00:37.110538 |
2026-04-20 00:00:37.110623 | TASK [validate-host : Write out all zuul information for each host]
2026-04-20 00:00:38.784164 | orchestrator -> localhost | changed
2026-04-20 00:00:38.792589 |
2026-04-20 00:00:38.792671 | TASK [prepare-workspace-log : Start zuul_console daemon.]
2026-04-20 00:00:39.102370 | orchestrator | ok
2026-04-20 00:00:39.107470 |
2026-04-20 00:00:39.107567 | TASK [prepare-workspace-log : Synchronize src repos to workspace directory.]
2026-04-20 00:02:02.592728 | orchestrator | changed:
2026-04-20 00:02:02.592961 | orchestrator | .d..t...... src/
2026-04-20 00:02:02.592997 | orchestrator | .d..t...... src/github.com/
2026-04-20 00:02:02.593022 | orchestrator | .d..t...... src/github.com/osism/
2026-04-20 00:02:02.593044 | orchestrator | .d..t...... src/github.com/osism/ansible-collection-commons/
2026-04-20 00:02:02.593065 | orchestrator | RedHat.yml
2026-04-20 00:02:02.624694 | orchestrator | .L..t...... src/github.com/osism/ansible-collection-commons/roles/repository/tasks/CentOS.yml -> RedHat.yml
2026-04-20 00:02:02.624715 | orchestrator | RedHat.yml
2026-04-20 00:02:02.624773 | orchestrator | = 2.2.0"...
2026-04-20 00:02:13.766343 | orchestrator | - Finding latest version of hashicorp/null...
2026-04-20 00:02:13.784177 | orchestrator | - Finding terraform-provider-openstack/openstack versions matching ">= 1.53.0"...
2026-04-20 00:02:13.934561 | orchestrator | - Installing hashicorp/null v3.2.4...
2026-04-20 00:02:14.388638 | orchestrator | - Installed hashicorp/null v3.2.4 (signed, key ID 0C0AF313E5FD9F80)
2026-04-20 00:02:14.458156 | orchestrator | - Installing terraform-provider-openstack/openstack v3.4.0...
2026-04-20 00:02:15.260533 | orchestrator | - Installed terraform-provider-openstack/openstack v3.4.0 (signed, key ID 4F80527A391BEFD2)
2026-04-20 00:02:15.335418 | orchestrator | - Installing hashicorp/local v2.8.0...
2026-04-20 00:02:15.974198 | orchestrator | - Installed hashicorp/local v2.8.0 (signed, key ID 0C0AF313E5FD9F80)
2026-04-20 00:02:15.974279 | orchestrator |
2026-04-20 00:02:15.974286 | orchestrator | Providers are signed by their developers.
2026-04-20 00:02:15.974291 | orchestrator | If you'd like to know more about provider signing, you can read about it here:
2026-04-20 00:02:15.974296 | orchestrator | https://opentofu.org/docs/cli/plugins/signing/
2026-04-20 00:02:15.974303 | orchestrator |
2026-04-20 00:02:15.974308 | orchestrator | OpenTofu has created a lock file .terraform.lock.hcl to record the provider
2026-04-20 00:02:15.974312 | orchestrator | selections it made above. Include this file in your version control repository
2026-04-20 00:02:15.974327 | orchestrator | so that OpenTofu can guarantee to make the same selections by default when
2026-04-20 00:02:15.974332 | orchestrator | you run "tofu init" in the future.
2026-04-20 00:02:15.974697 | orchestrator |
2026-04-20 00:02:15.974704 | orchestrator | OpenTofu has been successfully initialized!
2026-04-20 00:02:15.974707 | orchestrator |
2026-04-20 00:02:15.974711 | orchestrator | You may now begin working with OpenTofu. Try running "tofu plan" to see
2026-04-20 00:02:15.974715 | orchestrator | any changes that are required for your infrastructure. All OpenTofu commands
2026-04-20 00:02:15.974719 | orchestrator | should now work.
2026-04-20 00:02:15.974723 | orchestrator |
2026-04-20 00:02:15.974727 | orchestrator | If you ever set or change modules or backend configuration for OpenTofu,
2026-04-20 00:02:15.974731 | orchestrator | rerun this command to reinitialize your working directory. If you forget, other
2026-04-20 00:02:15.974736 | orchestrator | commands will detect it and remind you to do so if necessary.
2026-04-20 00:02:16.149811 | orchestrator | Created and switched to workspace "ci"!
2026-04-20 00:02:16.149874 | orchestrator |
2026-04-20 00:02:16.149881 | orchestrator | You're now on a new, empty workspace. Workspaces isolate their state,
2026-04-20 00:02:16.149886 | orchestrator | so if you run "tofu plan" OpenTofu will not see any existing state
2026-04-20 00:02:16.149891 | orchestrator | for this configuration.
2026-04-20 00:02:16.691365 | orchestrator | ci.auto.tfvars
2026-04-20 00:02:16.994855 | orchestrator | default_custom.tf
2026-04-20 00:02:18.540299 | orchestrator | data.openstack_networking_network_v2.public: Reading...
2026-04-20 00:02:19.187251 | orchestrator | data.openstack_networking_network_v2.public: Read complete after 0s [id=e6be7364-bfd8-4de7-8120-8f41c69a139a]
2026-04-20 00:02:19.874295 | orchestrator |
2026-04-20 00:02:19.874362 | orchestrator | OpenTofu used the selected providers to generate the following execution
2026-04-20 00:02:19.874371 | orchestrator | plan. Resource actions are indicated with the following symbols:
2026-04-20 00:02:19.874394 | orchestrator |   + create
2026-04-20 00:02:19.874409 | orchestrator |  <= read (data resources)
2026-04-20 00:02:19.874421 | orchestrator |
2026-04-20 00:02:19.874426 | orchestrator | OpenTofu will perform the following actions:
2026-04-20 00:02:19.874526 | orchestrator |
2026-04-20 00:02:19.874541 | orchestrator |   # data.openstack_images_image_v2.image will be read during apply
2026-04-20 00:02:19.874546 | orchestrator |   # (config refers to values not yet known)
2026-04-20 00:02:19.874550 | orchestrator |  <= data "openstack_images_image_v2" "image" {
2026-04-20 00:02:19.874554 | orchestrator |       + checksum    = (known after apply)
2026-04-20 00:02:19.874559 | orchestrator |       + created_at  = (known after apply)
2026-04-20 00:02:19.874563 | orchestrator |       + file        = (known after apply)
2026-04-20 00:02:19.874567 | orchestrator |       + id          = (known after apply)
2026-04-20 00:02:19.874589 | orchestrator |       + metadata    = (known after apply)
2026-04-20 00:02:19.874593 | orchestrator |       + min_disk_gb = (known after apply)
2026-04-20 00:02:19.874597 | orchestrator |       + min_ram_mb  = (known after apply)
2026-04-20 00:02:19.874601 | orchestrator |       + most_recent = true
2026-04-20 00:02:19.874606 | orchestrator |       + name        = (known after apply)
2026-04-20 00:02:19.874609 | orchestrator |       + protected   = (known after apply)
2026-04-20 00:02:19.874613 | orchestrator |       + region      = (known after apply)
2026-04-20 00:02:19.874620 | orchestrator |       + schema      = (known after apply)
2026-04-20 00:02:19.874624 | orchestrator |       + size_bytes  = (known after apply)
2026-04-20 00:02:19.874628 | orchestrator |       + tags        = (known after apply)
2026-04-20 00:02:19.874631 | orchestrator |       + updated_at  = (known after apply)
2026-04-20 00:02:19.874636 | orchestrator |     }
2026-04-20 00:02:19.874714 | orchestrator |
2026-04-20 00:02:19.874725 | orchestrator |   # data.openstack_images_image_v2.image_node will be read during apply
2026-04-20 00:02:19.874730 | orchestrator |   # (config refers to values not yet known)
2026-04-20 00:02:19.874734 | orchestrator |  <= data "openstack_images_image_v2" "image_node" {
2026-04-20 00:02:19.874739 | orchestrator |       + checksum    = (known after apply)
2026-04-20 00:02:19.874742 | orchestrator |       + created_at  = (known after apply)
2026-04-20 00:02:19.874746 | orchestrator |       + file        = (known after apply)
2026-04-20 00:02:19.874750 | orchestrator |       + id          = (known after apply)
2026-04-20 00:02:19.874754 | orchestrator |       + metadata    = (known after apply)
2026-04-20 00:02:19.874757 | orchestrator |       + min_disk_gb = (known after apply)
2026-04-20 00:02:19.874761 | orchestrator |       + min_ram_mb  = (known after apply)
2026-04-20 00:02:19.874765 | orchestrator |       + most_recent = true
2026-04-20 00:02:19.874769 | orchestrator |       + name        = (known after apply)
2026-04-20 00:02:19.874772 | orchestrator |       + protected   = (known after apply)
2026-04-20 00:02:19.874776 | orchestrator |       + region      = (known after apply)
2026-04-20 00:02:19.874780 | orchestrator |       + schema      = (known after apply)
2026-04-20 00:02:19.874784 | orchestrator |       + size_bytes  = (known after apply)
2026-04-20 00:02:19.874788 | orchestrator |       + tags        = (known after apply)
2026-04-20 00:02:19.874791 | orchestrator |       + updated_at  = (known after apply)
2026-04-20 00:02:19.874795 | orchestrator |     }
2026-04-20 00:02:19.874867 | orchestrator |
2026-04-20 00:02:19.874878 | orchestrator |   # local_file.MANAGER_ADDRESS will be created
2026-04-20 00:02:19.874883 | orchestrator |   + resource "local_file" "MANAGER_ADDRESS" {
2026-04-20 00:02:19.874887 | orchestrator |       + content              = (known after apply)
2026-04-20 00:02:19.874891 | orchestrator |       + content_base64sha256 = (known after apply)
2026-04-20 00:02:19.874895 | orchestrator |       + content_base64sha512 = (known after apply)
2026-04-20 00:02:19.874899 | orchestrator |       + content_md5          = (known after apply)
2026-04-20 00:02:19.874902 | orchestrator |       + content_sha1         = (known after apply)
2026-04-20 00:02:19.874906 | orchestrator |       + content_sha256       = (known after apply)
2026-04-20 00:02:19.874910 | orchestrator |       + content_sha512       = (known after apply)
2026-04-20 00:02:19.874914 | orchestrator |       + directory_permission = "0777"
2026-04-20 00:02:19.874918 | orchestrator |       + file_permission      = "0644"
2026-04-20 00:02:19.874921 | orchestrator |       + filename             = ".MANAGER_ADDRESS.ci"
2026-04-20 00:02:19.874925 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.874929 | orchestrator |     }
2026-04-20 00:02:19.874995 | orchestrator |
2026-04-20 00:02:19.875005 | orchestrator |   # local_file.id_rsa_pub will be created
2026-04-20 00:02:19.875010 | orchestrator |   + resource "local_file" "id_rsa_pub" {
2026-04-20 00:02:19.875013 | orchestrator |       + content              = (known after apply)
2026-04-20 00:02:19.875017 | orchestrator |       + content_base64sha256 = (known after apply)
2026-04-20 00:02:19.875021 | orchestrator |       + content_base64sha512 = (known after apply)
2026-04-20 00:02:19.875025 | orchestrator |       + content_md5          = (known after apply)
2026-04-20 00:02:19.875028 | orchestrator |       + content_sha1         = (known after apply)
2026-04-20 00:02:19.875032 | orchestrator |       + content_sha256       = (known after apply)
2026-04-20 00:02:19.875036 | orchestrator |       + content_sha512       = (known after apply)
2026-04-20 00:02:19.875040 | orchestrator |       + directory_permission = "0777"
2026-04-20 00:02:19.875043 | orchestrator |       + file_permission      = "0644"
2026-04-20 00:02:19.875052 | orchestrator |       + filename             = ".id_rsa.ci.pub"
2026-04-20 00:02:19.875056 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.875060 | orchestrator |     }
2026-04-20 00:02:19.875126 | orchestrator |
2026-04-20 00:02:19.875144 | orchestrator |   # local_file.inventory will be created
2026-04-20 00:02:19.875148 | orchestrator |   + resource "local_file" "inventory" {
2026-04-20 00:02:19.875152 | orchestrator |       + content              = (known after apply)
2026-04-20 00:02:19.875156 | orchestrator |       + content_base64sha256 = (known after apply)
2026-04-20 00:02:19.875160 | orchestrator |       + content_base64sha512 = (known after apply)
2026-04-20 00:02:19.875163 | orchestrator |       + content_md5          = (known after apply)
2026-04-20 00:02:19.875167 | orchestrator |       + content_sha1         = (known after apply)
2026-04-20 00:02:19.875171 | orchestrator |       + content_sha256       = (known after apply)
2026-04-20 00:02:19.875175 | orchestrator |       + content_sha512       = (known after apply)
2026-04-20 00:02:19.875179 | orchestrator |       + directory_permission = "0777"
2026-04-20 00:02:19.875182 | orchestrator |       + file_permission      = "0644"
2026-04-20 00:02:19.875186 | orchestrator |       + filename             = "inventory.ci"
2026-04-20 00:02:19.875190 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.875194 | orchestrator |     }
2026-04-20 00:02:19.875276 | orchestrator |
2026-04-20 00:02:19.875288 | orchestrator |   # local_sensitive_file.id_rsa will be created
2026-04-20 00:02:19.875292 | orchestrator |   + resource "local_sensitive_file" "id_rsa" {
2026-04-20 00:02:19.875296 | orchestrator |       + content              = (sensitive value)
2026-04-20 00:02:19.875300 | orchestrator |       + content_base64sha256 = (known after apply)
2026-04-20 00:02:19.875303 | orchestrator |       + content_base64sha512 = (known after apply)
2026-04-20 00:02:19.875307 | orchestrator |       + content_md5          = (known after apply)
2026-04-20 00:02:19.875311 | orchestrator |       + content_sha1         = (known after apply)
2026-04-20 00:02:19.875314 | orchestrator |       + content_sha256       = (known after apply)
2026-04-20 00:02:19.875318 | orchestrator |       + content_sha512       = (known after apply)
2026-04-20 00:02:19.875322 | orchestrator |       + directory_permission = "0700"
2026-04-20 00:02:19.875326 | orchestrator |       + file_permission      = "0600"
2026-04-20 00:02:19.875329 | orchestrator |       + filename             = ".id_rsa.ci"
2026-04-20 00:02:19.875333 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.875337 | orchestrator |     }
2026-04-20 00:02:19.875357 | orchestrator |
2026-04-20 00:02:19.875367 | orchestrator |   # null_resource.node_semaphore will be created
2026-04-20 00:02:19.875372 | orchestrator |   + resource "null_resource" "node_semaphore" {
2026-04-20 00:02:19.875376 | orchestrator |       + id = (known after apply)
2026-04-20 00:02:19.875379 | orchestrator |     }
2026-04-20 00:02:19.875443 | orchestrator |
2026-04-20 00:02:19.875454 | orchestrator |   # openstack_blockstorage_volume_v3.manager_base_volume[0] will be created
2026-04-20 00:02:19.875459 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "manager_base_volume" {
2026-04-20 00:02:19.875462 | orchestrator |       + attachment           = (known after apply)
2026-04-20 00:02:19.875466 | orchestrator |       + availability_zone    = "nova"
2026-04-20 00:02:19.875470 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.875474 | orchestrator |       + image_id             = (known after apply)
2026-04-20 00:02:19.875477 | orchestrator |       + metadata             = (known after apply)
2026-04-20 00:02:19.875481 | orchestrator |       + name                 = "testbed-volume-manager-base"
2026-04-20 00:02:19.875485 | orchestrator |       + region               = (known after apply)
2026-04-20 00:02:19.875489 | orchestrator |       + size                 = 80
2026-04-20 00:02:19.875492 | orchestrator |       + volume_retype_policy = "never"
2026-04-20 00:02:19.875496 | orchestrator |       + volume_type          = "ssd"
2026-04-20 00:02:19.875500 | orchestrator |     }
2026-04-20 00:02:19.875563 | orchestrator |
2026-04-20 00:02:19.875575 | orchestrator |   # openstack_blockstorage_volume_v3.node_base_volume[0] will be created
2026-04-20 00:02:19.875579 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2026-04-20 00:02:19.875583 | orchestrator |       + attachment           = (known after apply)
2026-04-20 00:02:19.875587 | orchestrator |       + availability_zone    = "nova"
2026-04-20 00:02:19.875591 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.875599 | orchestrator |       + image_id             = (known after apply)
2026-04-20 00:02:19.875602 | orchestrator |       + metadata             = (known after apply)
2026-04-20 00:02:19.875606 | orchestrator |       + name                 = "testbed-volume-0-node-base"
2026-04-20 00:02:19.875610 | orchestrator |       + region               = (known after apply)
2026-04-20 00:02:19.875613 | orchestrator |       + size                 = 80
2026-04-20 00:02:19.875617 | orchestrator |       + volume_retype_policy = "never"
2026-04-20 00:02:19.875621 | orchestrator |       + volume_type          = "ssd"
2026-04-20 00:02:19.875625 | orchestrator |     }
2026-04-20 00:02:19.875686 | orchestrator |
2026-04-20 00:02:19.875697 | orchestrator |   # openstack_blockstorage_volume_v3.node_base_volume[1] will be created
2026-04-20 00:02:19.875701 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2026-04-20 00:02:19.875705 | orchestrator |       + attachment           = (known after apply)
2026-04-20 00:02:19.875709 | orchestrator |       + availability_zone    = "nova"
2026-04-20 00:02:19.875713 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.875716 | orchestrator |       + image_id             = (known after apply)
2026-04-20 00:02:19.875720 | orchestrator |       + metadata             = (known after apply)
2026-04-20 00:02:19.875724 | orchestrator |       + name                 = "testbed-volume-1-node-base"
2026-04-20 00:02:19.875727 | orchestrator |       + region               = (known after apply)
2026-04-20 00:02:19.875731 | orchestrator |       + size                 = 80
2026-04-20 00:02:19.875735 | orchestrator |       + volume_retype_policy = "never"
2026-04-20 00:02:19.875739 | orchestrator |       + volume_type          = "ssd"
2026-04-20 00:02:19.875742 | orchestrator |     }
2026-04-20 00:02:19.875800 | orchestrator |
2026-04-20 00:02:19.875810 | orchestrator |   # openstack_blockstorage_volume_v3.node_base_volume[2] will be created
2026-04-20 00:02:19.875815 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2026-04-20 00:02:19.875818 | orchestrator |       + attachment           = (known after apply)
2026-04-20 00:02:19.875822 | orchestrator |       + availability_zone    = "nova"
2026-04-20 00:02:19.875826 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.875830 | orchestrator |       + image_id             = (known after apply)
2026-04-20 00:02:19.875833 | orchestrator |       + metadata             = (known after apply)
2026-04-20 00:02:19.875837 | orchestrator |       + name                 = "testbed-volume-2-node-base"
2026-04-20 00:02:19.875841 | orchestrator |       + region               = (known after apply)
2026-04-20 00:02:19.875844 | orchestrator |       + size                 = 80
2026-04-20 00:02:19.875848 | orchestrator |       + volume_retype_policy = "never"
2026-04-20 00:02:19.875852 | orchestrator |       + volume_type          = "ssd"
2026-04-20 00:02:19.875856 | orchestrator |     }
2026-04-20 00:02:19.875950 | orchestrator |
2026-04-20 00:02:19.875968 | orchestrator |   # openstack_blockstorage_volume_v3.node_base_volume[3] will be created
2026-04-20 00:02:19.875975 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2026-04-20 00:02:19.875982 | orchestrator |       + attachment           = (known after apply)
2026-04-20 00:02:19.875986 | orchestrator |       + availability_zone    = "nova"
2026-04-20 00:02:19.875990 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.875993 | orchestrator |       + image_id             = (known after apply)
2026-04-20 00:02:19.875997 | orchestrator |       + metadata             = (known after apply)
2026-04-20 00:02:19.876005 | orchestrator |       + name                 = "testbed-volume-3-node-base"
2026-04-20 00:02:19.876009 | orchestrator |       + region               = (known after apply)
2026-04-20 00:02:19.876013 | orchestrator |       + size                 = 80
2026-04-20 00:02:19.876017 | orchestrator |       + volume_retype_policy = "never"
2026-04-20 00:02:19.876020 | orchestrator |       + volume_type          = "ssd"
2026-04-20 00:02:19.876024 | orchestrator |     }
2026-04-20 00:02:19.876088 | orchestrator |
2026-04-20 00:02:19.876099 | orchestrator |   # openstack_blockstorage_volume_v3.node_base_volume[4] will be created
2026-04-20 00:02:19.876103 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2026-04-20 00:02:19.876107 | orchestrator |       + attachment           = (known after apply)
2026-04-20 00:02:19.876111 | orchestrator |       + availability_zone    = "nova"
2026-04-20 00:02:19.876115 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.876123 | orchestrator |       + image_id             = (known after apply)
2026-04-20 00:02:19.876127 | orchestrator |       + metadata             = (known after apply)
2026-04-20 00:02:19.876130 | orchestrator |       + name                 = "testbed-volume-4-node-base"
2026-04-20 00:02:19.876134 | orchestrator |       + region               = (known after apply)
2026-04-20 00:02:19.876138 | orchestrator |       + size                 = 80
2026-04-20 00:02:19.876142 | orchestrator |       + volume_retype_policy = "never"
2026-04-20 00:02:19.876145 | orchestrator |       + volume_type          = "ssd"
2026-04-20 00:02:19.876149 | orchestrator |     }
2026-04-20 00:02:19.876240 | orchestrator |
2026-04-20 00:02:19.876253 | orchestrator |   # openstack_blockstorage_volume_v3.node_base_volume[5] will be created
2026-04-20 00:02:19.876257 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2026-04-20 00:02:19.876261 | orchestrator |       + attachment           = (known after apply)
2026-04-20 00:02:19.876265 | orchestrator |       + availability_zone    = "nova"
2026-04-20 00:02:19.876268 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.876272 | orchestrator |       + image_id             = (known after apply)
2026-04-20 00:02:19.876276 | orchestrator |       + metadata             = (known after apply)
2026-04-20 00:02:19.876279 | orchestrator |       + name                 = "testbed-volume-5-node-base"
2026-04-20 00:02:19.876283 | orchestrator |       + region               = (known after apply)
2026-04-20 00:02:19.876287 | orchestrator |       + size                 = 80
2026-04-20 00:02:19.876290 | orchestrator |       + volume_retype_policy = "never"
2026-04-20 00:02:19.876294 | orchestrator |       + volume_type          = "ssd"
2026-04-20 00:02:19.876298 | orchestrator |     }
2026-04-20 00:02:19.876359 | orchestrator |
2026-04-20 00:02:19.876370 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[0] will be created
2026-04-20 00:02:19.876375 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-04-20 00:02:19.876379 | orchestrator |       + attachment           = (known after apply)
2026-04-20 00:02:19.876382 | orchestrator |       + availability_zone    = "nova"
2026-04-20 00:02:19.876386 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.876390 | orchestrator |       + metadata             = (known after apply)
2026-04-20 00:02:19.876394 | orchestrator |       + name                 = "testbed-volume-0-node-3"
2026-04-20 00:02:19.876398 | orchestrator |       + region               = (known after apply)
2026-04-20 00:02:19.876402 | orchestrator |       + size                 = 20
2026-04-20 00:02:19.876405 | orchestrator |       + volume_retype_policy = "never"
2026-04-20 00:02:19.876409 | orchestrator |       + volume_type          = "ssd"
2026-04-20 00:02:19.876413 | orchestrator |     }
2026-04-20 00:02:19.876470 | orchestrator |
2026-04-20 00:02:19.876481 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[1] will be created
2026-04-20 00:02:19.876485 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-04-20 00:02:19.876489 | orchestrator |       + attachment           = (known after apply)
2026-04-20 00:02:19.876492 | orchestrator |       + availability_zone    = "nova"
2026-04-20 00:02:19.876496 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.876500 | orchestrator |       + metadata             = (known after apply)
2026-04-20 00:02:19.876503 | orchestrator |       + name                 = "testbed-volume-1-node-4"
2026-04-20 00:02:19.876507 | orchestrator |       + region               = (known after apply)
2026-04-20 00:02:19.876511 | orchestrator |       + size                 = 20
2026-04-20 00:02:19.876515 | orchestrator |       + volume_retype_policy = "never"
2026-04-20 00:02:19.876518 | orchestrator |       + volume_type          = "ssd"
2026-04-20 00:02:19.876522 | orchestrator |     }
2026-04-20 00:02:19.876581 | orchestrator |
2026-04-20 00:02:19.876592 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[2] will be created
2026-04-20 00:02:19.876596 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-04-20 00:02:19.876600 | orchestrator |       + attachment           = (known after apply)
2026-04-20 00:02:19.876604 | orchestrator |       + availability_zone    = "nova"
2026-04-20 00:02:19.876607 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.876611 | orchestrator |       + metadata             = (known after apply)
2026-04-20 00:02:19.876615 | orchestrator |       + name                 = "testbed-volume-2-node-5"
2026-04-20 00:02:19.876618 | orchestrator |       + region               = (known after apply)
2026-04-20 00:02:19.876626 | orchestrator |       + size                 = 20
2026-04-20 00:02:19.876630 | orchestrator |       + volume_retype_policy = "never"
2026-04-20 00:02:19.876634 | orchestrator |       + volume_type          = "ssd"
2026-04-20 00:02:19.876637 | orchestrator |     }
2026-04-20 00:02:19.876693 | orchestrator |
2026-04-20 00:02:19.876704 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[3] will be created
2026-04-20 00:02:19.876708 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-04-20 00:02:19.876712 | orchestrator |       + attachment           = (known after apply)
2026-04-20 00:02:19.876715 | orchestrator |       + availability_zone    = "nova"
2026-04-20 00:02:19.876719 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.876723 | orchestrator |       + metadata             = (known after apply)
2026-04-20 00:02:19.876727 | orchestrator |       + name                 = "testbed-volume-3-node-3"
2026-04-20 00:02:19.876730 | orchestrator |       + region               = (known after apply)
2026-04-20 00:02:19.876734 | orchestrator |       + size                 = 20
2026-04-20 00:02:19.876738 | orchestrator |       + volume_retype_policy = "never"
2026-04-20 00:02:19.876741 | orchestrator |       + volume_type          = "ssd"
2026-04-20 00:02:19.876745 | orchestrator |     }
2026-04-20 00:02:19.876799 | orchestrator |
2026-04-20 00:02:19.876809 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[4] will be created
2026-04-20 00:02:19.876814 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-04-20 00:02:19.876817 | orchestrator |       + attachment           = (known after apply)
2026-04-20 00:02:19.876821 | orchestrator |       + availability_zone    = "nova"
2026-04-20 00:02:19.876825 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.876828 | orchestrator |       + metadata             = (known after apply)
2026-04-20 00:02:19.876832 | orchestrator |       + name                 = "testbed-volume-4-node-4"
2026-04-20 00:02:19.876836 | orchestrator |       + region               = (known after apply)
2026-04-20 00:02:19.876844 | orchestrator |       + size                 = 20
2026-04-20 00:02:19.876848 | orchestrator |       + volume_retype_policy = "never"
2026-04-20 00:02:19.876851 | orchestrator |       + volume_type          = "ssd"
2026-04-20 00:02:19.876855 | orchestrator |     }
2026-04-20 00:02:19.876911 | orchestrator |
2026-04-20 00:02:19.876922 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[5] will be created
2026-04-20 00:02:19.876926 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-04-20 00:02:19.876930 | orchestrator |       + attachment           = (known after apply)
2026-04-20 00:02:19.876933 | orchestrator |       + availability_zone    = "nova"
2026-04-20 00:02:19.876937 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.876941 | orchestrator |       + metadata             = (known after apply)
2026-04-20 00:02:19.876945 | orchestrator |       + name                 = "testbed-volume-5-node-5"
2026-04-20 00:02:19.876948 | orchestrator |       + region               = (known after apply)
2026-04-20 00:02:19.876952 | orchestrator |       + size                 = 20
2026-04-20 00:02:19.876956 | orchestrator |       + volume_retype_policy = "never"
2026-04-20 00:02:19.876959 | orchestrator |       + volume_type          = "ssd"
2026-04-20 00:02:19.876963 | orchestrator |     }
2026-04-20 00:02:19.877014 | orchestrator |
2026-04-20 00:02:19.877025 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[6] will be created
2026-04-20 00:02:19.877030 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-04-20 00:02:19.877033 | orchestrator |       + attachment           = (known after apply)
2026-04-20 00:02:19.877037 | orchestrator |       + availability_zone    = "nova"
2026-04-20 00:02:19.877041 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.877044 | orchestrator |       + metadata             = (known after apply)
2026-04-20 00:02:19.877048 | orchestrator |       + name                 = "testbed-volume-6-node-3"
2026-04-20 00:02:19.877052 | orchestrator |       + region               = (known after apply)
2026-04-20 00:02:19.877055 | orchestrator |       + size                 = 20
2026-04-20 00:02:19.877059 | orchestrator |       + volume_retype_policy = "never"
2026-04-20 00:02:19.877063 | orchestrator |       + volume_type          = "ssd"
2026-04-20 00:02:19.877066 | orchestrator |     }
2026-04-20 00:02:19.877118 | orchestrator |
2026-04-20 00:02:19.877129 | orchestrator |   # openstack_blockstorage_volume_v3.node_volume[7] will be created
2026-04-20 00:02:19.877133 | orchestrator |   + resource "openstack_blockstorage_volume_v3" "node_volume" {
2026-04-20 00:02:19.877140 | orchestrator |       + attachment           = (known after apply)
2026-04-20 00:02:19.877144 | orchestrator |       + availability_zone    = "nova"
2026-04-20 00:02:19.877147 | orchestrator |       + id                   = (known after apply)
2026-04-20 00:02:19.877151 | orchestrator |       + metadata             = (known after apply)
2026-04-20 00:02:19.877155 | orchestrator |       + name                 = "testbed-volume-7-node-4"
2026-04-20 00:02:19.877159 | orchestrator |       + region               = (known after apply)
2026-04-20 00:02:19.877162 | orchestrator |       + size                 = 20
2026-04-20 00:02:19.877166 | orchestrator |       + volume_retype_policy = "never"
2026-04-20 00:02:19.877170 | orchestrator |       + volume_type          = "ssd"
2026-04-20 00:02:19.877174 | orchestrator |     }
2026-04-20 00:02:19.877239 | orchestrator |
2026-04-20 00:02:19.877251 | orchestrator | #
openstack_blockstorage_volume_v3.node_volume[8] will be created 2026-04-20 00:02:19.877255 | orchestrator | + resource "openstack_blockstorage_volume_v3" "node_volume" { 2026-04-20 00:02:19.877259 | orchestrator | + attachment = (known after apply) 2026-04-20 00:02:19.877262 | orchestrator | + availability_zone = "nova" 2026-04-20 00:02:19.877266 | orchestrator | + id = (known after apply) 2026-04-20 00:02:19.877270 | orchestrator | + metadata = (known after apply) 2026-04-20 00:02:19.877273 | orchestrator | + name = "testbed-volume-8-node-5" 2026-04-20 00:02:19.877277 | orchestrator | + region = (known after apply) 2026-04-20 00:02:19.877281 | orchestrator | + size = 20 2026-04-20 00:02:19.877284 | orchestrator | + volume_retype_policy = "never" 2026-04-20 00:02:19.877288 | orchestrator | + volume_type = "ssd" 2026-04-20 00:02:19.877292 | orchestrator | } 2026-04-20 00:02:19.877476 | orchestrator | 2026-04-20 00:02:19.877489 | orchestrator | # openstack_compute_instance_v2.manager_server will be created 2026-04-20 00:02:19.877493 | orchestrator | + resource "openstack_compute_instance_v2" "manager_server" { 2026-04-20 00:02:19.877497 | orchestrator | + access_ip_v4 = (known after apply) 2026-04-20 00:02:19.877500 | orchestrator | + access_ip_v6 = (known after apply) 2026-04-20 00:02:19.877504 | orchestrator | + all_metadata = (known after apply) 2026-04-20 00:02:19.877508 | orchestrator | + all_tags = (known after apply) 2026-04-20 00:02:19.877512 | orchestrator | + availability_zone = "nova" 2026-04-20 00:02:19.877515 | orchestrator | + config_drive = true 2026-04-20 00:02:19.877519 | orchestrator | + created = (known after apply) 2026-04-20 00:02:19.877523 | orchestrator | + flavor_id = (known after apply) 2026-04-20 00:02:19.877527 | orchestrator | + flavor_name = "OSISM-4V-16" 2026-04-20 00:02:19.877531 | orchestrator | + force_delete = false 2026-04-20 00:02:19.877534 | orchestrator | + hypervisor_hostname = (known after apply) 2026-04-20 00:02:19.877538 | 
orchestrator | + id = (known after apply) 2026-04-20 00:02:19.877541 | orchestrator | + image_id = (known after apply) 2026-04-20 00:02:19.877545 | orchestrator | + image_name = (known after apply) 2026-04-20 00:02:19.877549 | orchestrator | + key_pair = "testbed" 2026-04-20 00:02:19.877553 | orchestrator | + name = "testbed-manager" 2026-04-20 00:02:19.877557 | orchestrator | + power_state = "active" 2026-04-20 00:02:19.877561 | orchestrator | + region = (known after apply) 2026-04-20 00:02:19.877564 | orchestrator | + security_groups = (known after apply) 2026-04-20 00:02:19.877568 | orchestrator | + stop_before_destroy = false 2026-04-20 00:02:19.877572 | orchestrator | + updated = (known after apply) 2026-04-20 00:02:19.877576 | orchestrator | + user_data = (sensitive value) 2026-04-20 00:02:19.877579 | orchestrator | 2026-04-20 00:02:19.877583 | orchestrator | + block_device { 2026-04-20 00:02:19.877587 | orchestrator | + boot_index = 0 2026-04-20 00:02:19.877591 | orchestrator | + delete_on_termination = false 2026-04-20 00:02:19.877598 | orchestrator | + destination_type = "volume" 2026-04-20 00:02:19.877602 | orchestrator | + multiattach = false 2026-04-20 00:02:19.877605 | orchestrator | + source_type = "volume" 2026-04-20 00:02:19.877609 | orchestrator | + uuid = (known after apply) 2026-04-20 00:02:19.877616 | orchestrator | } 2026-04-20 00:02:19.877620 | orchestrator | 2026-04-20 00:02:19.877626 | orchestrator | + network { 2026-04-20 00:02:19.877632 | orchestrator | + access_network = false 2026-04-20 00:02:19.877638 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-04-20 00:02:19.877645 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-04-20 00:02:19.877649 | orchestrator | + mac = (known after apply) 2026-04-20 00:02:19.877653 | orchestrator | + name = (known after apply) 2026-04-20 00:02:19.877657 | orchestrator | + port = (known after apply) 2026-04-20 00:02:19.877660 | orchestrator | + uuid = (known after apply) 2026-04-20 
00:02:19.877664 | orchestrator | } 2026-04-20 00:02:19.877668 | orchestrator | } 2026-04-20 00:02:19.877852 | orchestrator | 2026-04-20 00:02:19.877864 | orchestrator | # openstack_compute_instance_v2.node_server[0] will be created 2026-04-20 00:02:19.877868 | orchestrator | + resource "openstack_compute_instance_v2" "node_server" { 2026-04-20 00:02:19.877872 | orchestrator | + access_ip_v4 = (known after apply) 2026-04-20 00:02:19.877876 | orchestrator | + access_ip_v6 = (known after apply) 2026-04-20 00:02:19.877879 | orchestrator | + all_metadata = (known after apply) 2026-04-20 00:02:19.877883 | orchestrator | + all_tags = (known after apply) 2026-04-20 00:02:19.877887 | orchestrator | + availability_zone = "nova" 2026-04-20 00:02:19.877890 | orchestrator | + config_drive = true 2026-04-20 00:02:19.877894 | orchestrator | + created = (known after apply) 2026-04-20 00:02:19.877898 | orchestrator | + flavor_id = (known after apply) 2026-04-20 00:02:19.877901 | orchestrator | + flavor_name = "OSISM-8V-32" 2026-04-20 00:02:19.877905 | orchestrator | + force_delete = false 2026-04-20 00:02:19.877909 | orchestrator | + hypervisor_hostname = (known after apply) 2026-04-20 00:02:19.877913 | orchestrator | + id = (known after apply) 2026-04-20 00:02:19.877916 | orchestrator | + image_id = (known after apply) 2026-04-20 00:02:19.877920 | orchestrator | + image_name = (known after apply) 2026-04-20 00:02:19.877924 | orchestrator | + key_pair = "testbed" 2026-04-20 00:02:19.877927 | orchestrator | + name = "testbed-node-0" 2026-04-20 00:02:19.877931 | orchestrator | + power_state = "active" 2026-04-20 00:02:19.877935 | orchestrator | + region = (known after apply) 2026-04-20 00:02:19.877938 | orchestrator | + security_groups = (known after apply) 2026-04-20 00:02:19.877942 | orchestrator | + stop_before_destroy = false 2026-04-20 00:02:19.877946 | orchestrator | + updated = (known after apply) 2026-04-20 00:02:19.877949 | orchestrator | + user_data = 
"ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2026-04-20 00:02:19.877953 | orchestrator | 2026-04-20 00:02:19.877957 | orchestrator | + block_device { 2026-04-20 00:02:19.877960 | orchestrator | + boot_index = 0 2026-04-20 00:02:19.877964 | orchestrator | + delete_on_termination = false 2026-04-20 00:02:19.877968 | orchestrator | + destination_type = "volume" 2026-04-20 00:02:19.877971 | orchestrator | + multiattach = false 2026-04-20 00:02:19.877975 | orchestrator | + source_type = "volume" 2026-04-20 00:02:19.877979 | orchestrator | + uuid = (known after apply) 2026-04-20 00:02:19.877982 | orchestrator | } 2026-04-20 00:02:19.877986 | orchestrator | 2026-04-20 00:02:19.877990 | orchestrator | + network { 2026-04-20 00:02:19.877994 | orchestrator | + access_network = false 2026-04-20 00:02:19.877997 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-04-20 00:02:19.878001 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-04-20 00:02:19.878005 | orchestrator | + mac = (known after apply) 2026-04-20 00:02:19.878008 | orchestrator | + name = (known after apply) 2026-04-20 00:02:19.878030 | orchestrator | + port = (known after apply) 2026-04-20 00:02:19.878035 | orchestrator | + uuid = (known after apply) 2026-04-20 00:02:19.878039 | orchestrator | } 2026-04-20 00:02:19.878042 | orchestrator | } 2026-04-20 00:02:19.878241 | orchestrator | 2026-04-20 00:02:19.878254 | orchestrator | # openstack_compute_instance_v2.node_server[1] will be created 2026-04-20 00:02:19.878259 | orchestrator | + resource "openstack_compute_instance_v2" "node_server" { 2026-04-20 00:02:19.878262 | orchestrator | + access_ip_v4 = (known after apply) 2026-04-20 00:02:19.878271 | orchestrator | + access_ip_v6 = (known after apply) 2026-04-20 00:02:19.878274 | orchestrator | + all_metadata = (known after apply) 2026-04-20 00:02:19.878278 | orchestrator | + all_tags = (known after apply) 2026-04-20 00:02:19.878281 | orchestrator | + availability_zone = "nova" 2026-04-20 00:02:19.878285 
| orchestrator | + config_drive = true 2026-04-20 00:02:19.878289 | orchestrator | + created = (known after apply) 2026-04-20 00:02:19.878292 | orchestrator | + flavor_id = (known after apply) 2026-04-20 00:02:19.878296 | orchestrator | + flavor_name = "OSISM-8V-32" 2026-04-20 00:02:19.878300 | orchestrator | + force_delete = false 2026-04-20 00:02:19.878303 | orchestrator | + hypervisor_hostname = (known after apply) 2026-04-20 00:02:19.878307 | orchestrator | + id = (known after apply) 2026-04-20 00:02:19.878310 | orchestrator | + image_id = (known after apply) 2026-04-20 00:02:19.878314 | orchestrator | + image_name = (known after apply) 2026-04-20 00:02:19.878318 | orchestrator | + key_pair = "testbed" 2026-04-20 00:02:19.878321 | orchestrator | + name = "testbed-node-1" 2026-04-20 00:02:19.878325 | orchestrator | + power_state = "active" 2026-04-20 00:02:19.878329 | orchestrator | + region = (known after apply) 2026-04-20 00:02:19.878332 | orchestrator | + security_groups = (known after apply) 2026-04-20 00:02:19.878336 | orchestrator | + stop_before_destroy = false 2026-04-20 00:02:19.878340 | orchestrator | + updated = (known after apply) 2026-04-20 00:02:19.878343 | orchestrator | + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2026-04-20 00:02:19.878347 | orchestrator | 2026-04-20 00:02:19.878351 | orchestrator | + block_device { 2026-04-20 00:02:19.878354 | orchestrator | + boot_index = 0 2026-04-20 00:02:19.878358 | orchestrator | + delete_on_termination = false 2026-04-20 00:02:19.878362 | orchestrator | + destination_type = "volume" 2026-04-20 00:02:19.878365 | orchestrator | + multiattach = false 2026-04-20 00:02:19.878369 | orchestrator | + source_type = "volume" 2026-04-20 00:02:19.878372 | orchestrator | + uuid = (known after apply) 2026-04-20 00:02:19.878376 | orchestrator | } 2026-04-20 00:02:19.878380 | orchestrator | 2026-04-20 00:02:19.878383 | orchestrator | + network { 2026-04-20 00:02:19.878387 | orchestrator | + access_network = 
false 2026-04-20 00:02:19.878391 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-04-20 00:02:19.878394 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-04-20 00:02:19.878398 | orchestrator | + mac = (known after apply) 2026-04-20 00:02:19.878402 | orchestrator | + name = (known after apply) 2026-04-20 00:02:19.878405 | orchestrator | + port = (known after apply) 2026-04-20 00:02:19.878409 | orchestrator | + uuid = (known after apply) 2026-04-20 00:02:19.878413 | orchestrator | } 2026-04-20 00:02:19.878416 | orchestrator | } 2026-04-20 00:02:19.878598 | orchestrator | 2026-04-20 00:02:19.878610 | orchestrator | # openstack_compute_instance_v2.node_server[2] will be created 2026-04-20 00:02:19.878614 | orchestrator | + resource "openstack_compute_instance_v2" "node_server" { 2026-04-20 00:02:19.878618 | orchestrator | + access_ip_v4 = (known after apply) 2026-04-20 00:02:19.878622 | orchestrator | + access_ip_v6 = (known after apply) 2026-04-20 00:02:19.878626 | orchestrator | + all_metadata = (known after apply) 2026-04-20 00:02:19.878630 | orchestrator | + all_tags = (known after apply) 2026-04-20 00:02:19.878636 | orchestrator | + availability_zone = "nova" 2026-04-20 00:02:19.878640 | orchestrator | + config_drive = true 2026-04-20 00:02:19.878644 | orchestrator | + created = (known after apply) 2026-04-20 00:02:19.878648 | orchestrator | + flavor_id = (known after apply) 2026-04-20 00:02:19.878651 | orchestrator | + flavor_name = "OSISM-8V-32" 2026-04-20 00:02:19.878655 | orchestrator | + force_delete = false 2026-04-20 00:02:19.878659 | orchestrator | + hypervisor_hostname = (known after apply) 2026-04-20 00:02:19.878662 | orchestrator | + id = (known after apply) 2026-04-20 00:02:19.878666 | orchestrator | + image_id = (known after apply) 2026-04-20 00:02:19.878677 | orchestrator | + image_name = (known after apply) 2026-04-20 00:02:19.878681 | orchestrator | + key_pair = "testbed" 2026-04-20 00:02:19.878684 | orchestrator | + name = 
"testbed-node-2" 2026-04-20 00:02:19.878688 | orchestrator | + power_state = "active" 2026-04-20 00:02:19.878692 | orchestrator | + region = (known after apply) 2026-04-20 00:02:19.878695 | orchestrator | + security_groups = (known after apply) 2026-04-20 00:02:19.878699 | orchestrator | + stop_before_destroy = false 2026-04-20 00:02:19.878703 | orchestrator | + updated = (known after apply) 2026-04-20 00:02:19.878707 | orchestrator | + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2026-04-20 00:02:19.878710 | orchestrator | 2026-04-20 00:02:19.878714 | orchestrator | + block_device { 2026-04-20 00:02:19.878718 | orchestrator | + boot_index = 0 2026-04-20 00:02:19.878721 | orchestrator | + delete_on_termination = false 2026-04-20 00:02:19.878725 | orchestrator | + destination_type = "volume" 2026-04-20 00:02:19.878728 | orchestrator | + multiattach = false 2026-04-20 00:02:19.878732 | orchestrator | + source_type = "volume" 2026-04-20 00:02:19.878736 | orchestrator | + uuid = (known after apply) 2026-04-20 00:02:19.878739 | orchestrator | } 2026-04-20 00:02:19.878743 | orchestrator | 2026-04-20 00:02:19.878747 | orchestrator | + network { 2026-04-20 00:02:19.878751 | orchestrator | + access_network = false 2026-04-20 00:02:19.878754 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-04-20 00:02:19.878758 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-04-20 00:02:19.878762 | orchestrator | + mac = (known after apply) 2026-04-20 00:02:19.878765 | orchestrator | + name = (known after apply) 2026-04-20 00:02:19.878769 | orchestrator | + port = (known after apply) 2026-04-20 00:02:19.878773 | orchestrator | + uuid = (known after apply) 2026-04-20 00:02:19.878776 | orchestrator | } 2026-04-20 00:02:19.878780 | orchestrator | } 2026-04-20 00:02:19.878958 | orchestrator | 2026-04-20 00:02:19.878970 | orchestrator | # openstack_compute_instance_v2.node_server[3] will be created 2026-04-20 00:02:19.878974 | orchestrator | + resource 
"openstack_compute_instance_v2" "node_server" { 2026-04-20 00:02:19.878978 | orchestrator | + access_ip_v4 = (known after apply) 2026-04-20 00:02:19.878982 | orchestrator | + access_ip_v6 = (known after apply) 2026-04-20 00:02:19.878985 | orchestrator | + all_metadata = (known after apply) 2026-04-20 00:02:19.878989 | orchestrator | + all_tags = (known after apply) 2026-04-20 00:02:19.878993 | orchestrator | + availability_zone = "nova" 2026-04-20 00:02:19.878996 | orchestrator | + config_drive = true 2026-04-20 00:02:19.879000 | orchestrator | + created = (known after apply) 2026-04-20 00:02:19.879004 | orchestrator | + flavor_id = (known after apply) 2026-04-20 00:02:19.879007 | orchestrator | + flavor_name = "OSISM-8V-32" 2026-04-20 00:02:19.879011 | orchestrator | + force_delete = false 2026-04-20 00:02:19.879015 | orchestrator | + hypervisor_hostname = (known after apply) 2026-04-20 00:02:19.879018 | orchestrator | + id = (known after apply) 2026-04-20 00:02:19.879022 | orchestrator | + image_id = (known after apply) 2026-04-20 00:02:19.879026 | orchestrator | + image_name = (known after apply) 2026-04-20 00:02:19.879029 | orchestrator | + key_pair = "testbed" 2026-04-20 00:02:19.879033 | orchestrator | + name = "testbed-node-3" 2026-04-20 00:02:19.879037 | orchestrator | + power_state = "active" 2026-04-20 00:02:19.879040 | orchestrator | + region = (known after apply) 2026-04-20 00:02:19.879044 | orchestrator | + security_groups = (known after apply) 2026-04-20 00:02:19.879048 | orchestrator | + stop_before_destroy = false 2026-04-20 00:02:19.879051 | orchestrator | + updated = (known after apply) 2026-04-20 00:02:19.879055 | orchestrator | + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2026-04-20 00:02:19.879059 | orchestrator | 2026-04-20 00:02:19.879063 | orchestrator | + block_device { 2026-04-20 00:02:19.879069 | orchestrator | + boot_index = 0 2026-04-20 00:02:19.879073 | orchestrator | + delete_on_termination = false 2026-04-20 
00:02:19.879076 | orchestrator | + destination_type = "volume" 2026-04-20 00:02:19.879084 | orchestrator | + multiattach = false 2026-04-20 00:02:19.879087 | orchestrator | + source_type = "volume" 2026-04-20 00:02:19.879091 | orchestrator | + uuid = (known after apply) 2026-04-20 00:02:19.879095 | orchestrator | } 2026-04-20 00:02:19.879098 | orchestrator | 2026-04-20 00:02:19.879102 | orchestrator | + network { 2026-04-20 00:02:19.879106 | orchestrator | + access_network = false 2026-04-20 00:02:19.879109 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-04-20 00:02:19.879113 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-04-20 00:02:19.879117 | orchestrator | + mac = (known after apply) 2026-04-20 00:02:19.879120 | orchestrator | + name = (known after apply) 2026-04-20 00:02:19.879124 | orchestrator | + port = (known after apply) 2026-04-20 00:02:19.879128 | orchestrator | + uuid = (known after apply) 2026-04-20 00:02:19.879131 | orchestrator | } 2026-04-20 00:02:19.879135 | orchestrator | } 2026-04-20 00:02:19.879319 | orchestrator | 2026-04-20 00:02:19.879332 | orchestrator | # openstack_compute_instance_v2.node_server[4] will be created 2026-04-20 00:02:19.879336 | orchestrator | + resource "openstack_compute_instance_v2" "node_server" { 2026-04-20 00:02:19.879340 | orchestrator | + access_ip_v4 = (known after apply) 2026-04-20 00:02:19.879343 | orchestrator | + access_ip_v6 = (known after apply) 2026-04-20 00:02:19.879347 | orchestrator | + all_metadata = (known after apply) 2026-04-20 00:02:19.879351 | orchestrator | + all_tags = (known after apply) 2026-04-20 00:02:19.879354 | orchestrator | + availability_zone = "nova" 2026-04-20 00:02:19.879358 | orchestrator | + config_drive = true 2026-04-20 00:02:19.879362 | orchestrator | + created = (known after apply) 2026-04-20 00:02:19.879365 | orchestrator | + flavor_id = (known after apply) 2026-04-20 00:02:19.879369 | orchestrator | + flavor_name = "OSISM-8V-32" 2026-04-20 00:02:19.879373 | 
orchestrator | + force_delete = false 2026-04-20 00:02:19.879377 | orchestrator | + hypervisor_hostname = (known after apply) 2026-04-20 00:02:19.879380 | orchestrator | + id = (known after apply) 2026-04-20 00:02:19.879384 | orchestrator | + image_id = (known after apply) 2026-04-20 00:02:19.879388 | orchestrator | + image_name = (known after apply) 2026-04-20 00:02:19.879391 | orchestrator | + key_pair = "testbed" 2026-04-20 00:02:19.879395 | orchestrator | + name = "testbed-node-4" 2026-04-20 00:02:19.879399 | orchestrator | + power_state = "active" 2026-04-20 00:02:19.879402 | orchestrator | + region = (known after apply) 2026-04-20 00:02:19.879406 | orchestrator | + security_groups = (known after apply) 2026-04-20 00:02:19.879410 | orchestrator | + stop_before_destroy = false 2026-04-20 00:02:19.879414 | orchestrator | + updated = (known after apply) 2026-04-20 00:02:19.879417 | orchestrator | + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2026-04-20 00:02:19.879421 | orchestrator | 2026-04-20 00:02:19.879425 | orchestrator | + block_device { 2026-04-20 00:02:19.879428 | orchestrator | + boot_index = 0 2026-04-20 00:02:19.879432 | orchestrator | + delete_on_termination = false 2026-04-20 00:02:19.879436 | orchestrator | + destination_type = "volume" 2026-04-20 00:02:19.879439 | orchestrator | + multiattach = false 2026-04-20 00:02:19.879443 | orchestrator | + source_type = "volume" 2026-04-20 00:02:19.879447 | orchestrator | + uuid = (known after apply) 2026-04-20 00:02:19.879450 | orchestrator | } 2026-04-20 00:02:19.879454 | orchestrator | 2026-04-20 00:02:19.879458 | orchestrator | + network { 2026-04-20 00:02:19.879461 | orchestrator | + access_network = false 2026-04-20 00:02:19.879465 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-04-20 00:02:19.879469 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-04-20 00:02:19.879472 | orchestrator | + mac = (known after apply) 2026-04-20 00:02:19.879476 | orchestrator | + name = (known 
after apply) 2026-04-20 00:02:19.879480 | orchestrator | + port = (known after apply) 2026-04-20 00:02:19.879483 | orchestrator | + uuid = (known after apply) 2026-04-20 00:02:19.879487 | orchestrator | } 2026-04-20 00:02:19.879491 | orchestrator | } 2026-04-20 00:02:19.879675 | orchestrator | 2026-04-20 00:02:19.879687 | orchestrator | # openstack_compute_instance_v2.node_server[5] will be created 2026-04-20 00:02:19.879691 | orchestrator | + resource "openstack_compute_instance_v2" "node_server" { 2026-04-20 00:02:19.879695 | orchestrator | + access_ip_v4 = (known after apply) 2026-04-20 00:02:19.879699 | orchestrator | + access_ip_v6 = (known after apply) 2026-04-20 00:02:19.879703 | orchestrator | + all_metadata = (known after apply) 2026-04-20 00:02:19.879706 | orchestrator | + all_tags = (known after apply) 2026-04-20 00:02:19.879710 | orchestrator | + availability_zone = "nova" 2026-04-20 00:02:19.879714 | orchestrator | + config_drive = true 2026-04-20 00:02:19.879717 | orchestrator | + created = (known after apply) 2026-04-20 00:02:19.879721 | orchestrator | + flavor_id = (known after apply) 2026-04-20 00:02:19.879725 | orchestrator | + flavor_name = "OSISM-8V-32" 2026-04-20 00:02:19.879728 | orchestrator | + force_delete = false 2026-04-20 00:02:19.879735 | orchestrator | + hypervisor_hostname = (known after apply) 2026-04-20 00:02:19.879739 | orchestrator | + id = (known after apply) 2026-04-20 00:02:19.879742 | orchestrator | + image_id = (known after apply) 2026-04-20 00:02:19.879746 | orchestrator | + image_name = (known after apply) 2026-04-20 00:02:19.879750 | orchestrator | + key_pair = "testbed" 2026-04-20 00:02:19.879753 | orchestrator | + name = "testbed-node-5" 2026-04-20 00:02:19.879757 | orchestrator | + power_state = "active" 2026-04-20 00:02:19.879761 | orchestrator | + region = (known after apply) 2026-04-20 00:02:19.879764 | orchestrator | + security_groups = (known after apply) 2026-04-20 00:02:19.879768 | orchestrator | + 
stop_before_destroy = false 2026-04-20 00:02:19.879772 | orchestrator | + updated = (known after apply) 2026-04-20 00:02:19.879776 | orchestrator | + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2026-04-20 00:02:19.879779 | orchestrator | 2026-04-20 00:02:19.879783 | orchestrator | + block_device { 2026-04-20 00:02:19.879787 | orchestrator | + boot_index = 0 2026-04-20 00:02:19.879790 | orchestrator | + delete_on_termination = false 2026-04-20 00:02:19.879794 | orchestrator | + destination_type = "volume" 2026-04-20 00:02:19.879798 | orchestrator | + multiattach = false 2026-04-20 00:02:19.879801 | orchestrator | + source_type = "volume" 2026-04-20 00:02:19.879805 | orchestrator | + uuid = (known after apply) 2026-04-20 00:02:19.879808 | orchestrator | } 2026-04-20 00:02:19.879812 | orchestrator | 2026-04-20 00:02:19.879816 | orchestrator | + network { 2026-04-20 00:02:19.879820 | orchestrator | + access_network = false 2026-04-20 00:02:19.879823 | orchestrator | + fixed_ip_v4 = (known after apply) 2026-04-20 00:02:19.879827 | orchestrator | + fixed_ip_v6 = (known after apply) 2026-04-20 00:02:19.879831 | orchestrator | + mac = (known after apply) 2026-04-20 00:02:19.879834 | orchestrator | + name = (known after apply) 2026-04-20 00:02:19.879838 | orchestrator | + port = (known after apply) 2026-04-20 00:02:19.879842 | orchestrator | + uuid = (known after apply) 2026-04-20 00:02:19.879846 | orchestrator | } 2026-04-20 00:02:19.879849 | orchestrator | } 2026-04-20 00:02:19.879894 | orchestrator | 2026-04-20 00:02:19.879904 | orchestrator | # openstack_compute_keypair_v2.key will be created 2026-04-20 00:02:19.879909 | orchestrator | + resource "openstack_compute_keypair_v2" "key" { 2026-04-20 00:02:19.879913 | orchestrator | + fingerprint = (known after apply) 2026-04-20 00:02:19.879917 | orchestrator | + id = (known after apply) 2026-04-20 00:02:19.879920 | orchestrator | + name = "testbed" 2026-04-20 00:02:19.879924 | orchestrator | + private_key = 
(sensitive value) 2026-04-20 00:02:19.879928 | orchestrator | + public_key = (known after apply) 2026-04-20 00:02:19.879932 | orchestrator | + region = (known after apply) 2026-04-20 00:02:19.879935 | orchestrator | + user_id = (known after apply) 2026-04-20 00:02:19.879939 | orchestrator | } 2026-04-20 00:02:19.879975 | orchestrator | 2026-04-20 00:02:19.879986 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[0] will be created 2026-04-20 00:02:19.879991 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2026-04-20 00:02:19.879999 | orchestrator | + device = (known after apply) 2026-04-20 00:02:19.880003 | orchestrator | + id = (known after apply) 2026-04-20 00:02:19.880006 | orchestrator | + instance_id = (known after apply) 2026-04-20 00:02:19.880010 | orchestrator | + region = (known after apply) 2026-04-20 00:02:19.880014 | orchestrator | + volume_id = (known after apply) 2026-04-20 00:02:19.880017 | orchestrator | } 2026-04-20 00:02:19.880051 | orchestrator | 2026-04-20 00:02:19.880062 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[1] will be created 2026-04-20 00:02:19.880067 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2026-04-20 00:02:19.880071 | orchestrator | + device = (known after apply) 2026-04-20 00:02:19.880074 | orchestrator | + id = (known after apply) 2026-04-20 00:02:19.880078 | orchestrator | + instance_id = (known after apply) 2026-04-20 00:02:19.880082 | orchestrator | + region = (known after apply) 2026-04-20 00:02:19.880086 | orchestrator | + volume_id = (known after apply) 2026-04-20 00:02:19.880089 | orchestrator | } 2026-04-20 00:02:19.880123 | orchestrator | 2026-04-20 00:02:19.880134 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[2] will be created 2026-04-20 00:02:19.880138 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" 
{
2026-04-20 00:02:19.880142 | orchestrator | + device = (known after apply)
2026-04-20 00:02:19.880146 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.880149 | orchestrator | + instance_id = (known after apply)
2026-04-20 00:02:19.880153 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.880157 | orchestrator | + volume_id = (known after apply)
2026-04-20 00:02:19.880160 | orchestrator | }
2026-04-20 00:02:19.880194 | orchestrator |
2026-04-20 00:02:19.880205 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[3] will be created
2026-04-20 00:02:19.880238 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
2026-04-20 00:02:19.880242 | orchestrator | + device = (known after apply)
2026-04-20 00:02:19.880246 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.880250 | orchestrator | + instance_id = (known after apply)
2026-04-20 00:02:19.880253 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.880257 | orchestrator | + volume_id = (known after apply)
2026-04-20 00:02:19.880261 | orchestrator | }
2026-04-20 00:02:19.880295 | orchestrator |
2026-04-20 00:02:19.880306 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[4] will be created
2026-04-20 00:02:19.880310 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
2026-04-20 00:02:19.880314 | orchestrator | + device = (known after apply)
2026-04-20 00:02:19.880318 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.880321 | orchestrator | + instance_id = (known after apply)
2026-04-20 00:02:19.880328 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.880332 | orchestrator | + volume_id = (known after apply)
2026-04-20 00:02:19.880336 | orchestrator | }
2026-04-20 00:02:19.880370 | orchestrator |
2026-04-20 00:02:19.880380 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[5] will be created
2026-04-20 00:02:19.880385 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
2026-04-20 00:02:19.880388 | orchestrator | + device = (known after apply)
2026-04-20 00:02:19.880392 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.880396 | orchestrator | + instance_id = (known after apply)
2026-04-20 00:02:19.880399 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.880403 | orchestrator | + volume_id = (known after apply)
2026-04-20 00:02:19.880407 | orchestrator | }
2026-04-20 00:02:19.880443 | orchestrator |
2026-04-20 00:02:19.880453 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[6] will be created
2026-04-20 00:02:19.880458 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
2026-04-20 00:02:19.880461 | orchestrator | + device = (known after apply)
2026-04-20 00:02:19.880465 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.880469 | orchestrator | + instance_id = (known after apply)
2026-04-20 00:02:19.880472 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.880480 | orchestrator | + volume_id = (known after apply)
2026-04-20 00:02:19.880483 | orchestrator | }
2026-04-20 00:02:19.880517 | orchestrator |
2026-04-20 00:02:19.880527 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[7] will be created
2026-04-20 00:02:19.880532 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
2026-04-20 00:02:19.880535 | orchestrator | + device = (known after apply)
2026-04-20 00:02:19.880539 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.880543 | orchestrator | + instance_id = (known after apply)
2026-04-20 00:02:19.880547 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.880551 | orchestrator | + volume_id = (known after apply)
2026-04-20 00:02:19.880554 | orchestrator | }
2026-04-20 00:02:19.880587 | orchestrator |
2026-04-20 00:02:19.880597 | orchestrator | # openstack_compute_volume_attach_v2.node_volume_attachment[8] will be created
2026-04-20 00:02:19.880602 | orchestrator | + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
2026-04-20 00:02:19.880605 | orchestrator | + device = (known after apply)
2026-04-20 00:02:19.880609 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.880613 | orchestrator | + instance_id = (known after apply)
2026-04-20 00:02:19.880617 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.880620 | orchestrator | + volume_id = (known after apply)
2026-04-20 00:02:19.880624 | orchestrator | }
2026-04-20 00:02:19.880656 | orchestrator |
2026-04-20 00:02:19.880666 | orchestrator | # openstack_networking_floatingip_associate_v2.manager_floating_ip_association will be created
2026-04-20 00:02:19.880672 | orchestrator | + resource "openstack_networking_floatingip_associate_v2" "manager_floating_ip_association" {
2026-04-20 00:02:19.880675 | orchestrator | + fixed_ip = (known after apply)
2026-04-20 00:02:19.880679 | orchestrator | + floating_ip = (known after apply)
2026-04-20 00:02:19.880683 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.880687 | orchestrator | + port_id = (known after apply)
2026-04-20 00:02:19.880690 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.880694 | orchestrator | }
2026-04-20 00:02:19.880752 | orchestrator |
2026-04-20 00:02:19.880763 | orchestrator | # openstack_networking_floatingip_v2.manager_floating_ip will be created
2026-04-20 00:02:19.880768 | orchestrator | + resource "openstack_networking_floatingip_v2" "manager_floating_ip" {
2026-04-20 00:02:19.880771 | orchestrator | + address = (known after apply)
2026-04-20 00:02:19.880775 | orchestrator | + all_tags = (known after apply)
2026-04-20 00:02:19.880779 | orchestrator | + dns_domain = (known after apply)
2026-04-20 00:02:19.880783 | orchestrator | + dns_name = (known after apply)
2026-04-20 00:02:19.880787 | orchestrator | + fixed_ip = (known after apply)
2026-04-20 00:02:19.880790 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.880794 | orchestrator | + pool = "public"
2026-04-20 00:02:19.880798 | orchestrator | + port_id = (known after apply)
2026-04-20 00:02:19.880802 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.880805 | orchestrator | + subnet_id = (known after apply)
2026-04-20 00:02:19.880809 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.880813 | orchestrator | }
2026-04-20 00:02:19.880895 | orchestrator |
2026-04-20 00:02:19.880907 | orchestrator | # openstack_networking_network_v2.net_management will be created
2026-04-20 00:02:19.880911 | orchestrator | + resource "openstack_networking_network_v2" "net_management" {
2026-04-20 00:02:19.880915 | orchestrator | + admin_state_up = (known after apply)
2026-04-20 00:02:19.880918 | orchestrator | + all_tags = (known after apply)
2026-04-20 00:02:19.880922 | orchestrator | + availability_zone_hints = [
2026-04-20 00:02:19.880926 | orchestrator | + "nova",
2026-04-20 00:02:19.880930 | orchestrator | ]
2026-04-20 00:02:19.880933 | orchestrator | + dns_domain = (known after apply)
2026-04-20 00:02:19.880937 | orchestrator | + external = (known after apply)
2026-04-20 00:02:19.880941 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.880944 | orchestrator | + mtu = (known after apply)
2026-04-20 00:02:19.880948 | orchestrator | + name = "net-testbed-management"
2026-04-20 00:02:19.880952 | orchestrator | + port_security_enabled = (known after apply)
2026-04-20 00:02:19.880959 | orchestrator | + qos_policy_id = (known after apply)
2026-04-20 00:02:19.880962 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.880966 | orchestrator | + shared = (known after apply)
2026-04-20 00:02:19.880970 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.880974 | orchestrator | + transparent_vlan = (known after apply)
2026-04-20 00:02:19.880977 | orchestrator |
2026-04-20 00:02:19.880981 | orchestrator | + segments (known after apply)
2026-04-20 00:02:19.880985 | orchestrator | }
2026-04-20 00:02:19.881099 | orchestrator |
2026-04-20 00:02:19.881110 | orchestrator | # openstack_networking_port_v2.manager_port_management will be created
2026-04-20 00:02:19.881114 | orchestrator | + resource "openstack_networking_port_v2" "manager_port_management" {
2026-04-20 00:02:19.881118 | orchestrator | + admin_state_up = (known after apply)
2026-04-20 00:02:19.881122 | orchestrator | + all_fixed_ips = (known after apply)
2026-04-20 00:02:19.881126 | orchestrator | + all_security_group_ids = (known after apply)
2026-04-20 00:02:19.881132 | orchestrator | + all_tags = (known after apply)
2026-04-20 00:02:19.881136 | orchestrator | + device_id = (known after apply)
2026-04-20 00:02:19.881140 | orchestrator | + device_owner = (known after apply)
2026-04-20 00:02:19.881143 | orchestrator | + dns_assignment = (known after apply)
2026-04-20 00:02:19.881147 | orchestrator | + dns_name = (known after apply)
2026-04-20 00:02:19.881151 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.881155 | orchestrator | + mac_address = (known after apply)
2026-04-20 00:02:19.881158 | orchestrator | + network_id = (known after apply)
2026-04-20 00:02:19.881162 | orchestrator | + port_security_enabled = (known after apply)
2026-04-20 00:02:19.881165 | orchestrator | + qos_policy_id = (known after apply)
2026-04-20 00:02:19.881169 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.881173 | orchestrator | + security_group_ids = (known after apply)
2026-04-20 00:02:19.881177 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.881180 | orchestrator |
2026-04-20 00:02:19.881184 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.881188 | orchestrator | + ip_address = "192.168.16.8/32"
2026-04-20 00:02:19.881191 | orchestrator | }
2026-04-20 00:02:19.881195 | orchestrator |
2026-04-20 00:02:19.881199 | orchestrator | + binding (known after apply)
2026-04-20 00:02:19.881203 | orchestrator |
2026-04-20 00:02:19.881220 | orchestrator | + fixed_ip {
2026-04-20 00:02:19.881224 | orchestrator | + ip_address = "192.168.16.5"
2026-04-20 00:02:19.881228 | orchestrator | + subnet_id = (known after apply)
2026-04-20 00:02:19.881231 | orchestrator | }
2026-04-20 00:02:19.881235 | orchestrator | }
2026-04-20 00:02:19.881363 | orchestrator |
2026-04-20 00:02:19.881375 | orchestrator | # openstack_networking_port_v2.node_port_management[0] will be created
2026-04-20 00:02:19.881380 | orchestrator | + resource "openstack_networking_port_v2" "node_port_management" {
2026-04-20 00:02:19.881383 | orchestrator | + admin_state_up = (known after apply)
2026-04-20 00:02:19.881387 | orchestrator | + all_fixed_ips = (known after apply)
2026-04-20 00:02:19.881391 | orchestrator | + all_security_group_ids = (known after apply)
2026-04-20 00:02:19.881394 | orchestrator | + all_tags = (known after apply)
2026-04-20 00:02:19.881398 | orchestrator | + device_id = (known after apply)
2026-04-20 00:02:19.881402 | orchestrator | + device_owner = (known after apply)
2026-04-20 00:02:19.881406 | orchestrator | + dns_assignment = (known after apply)
2026-04-20 00:02:19.881409 | orchestrator | + dns_name = (known after apply)
2026-04-20 00:02:19.881413 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.881417 | orchestrator | + mac_address = (known after apply)
2026-04-20 00:02:19.881421 | orchestrator | + network_id = (known after apply)
2026-04-20 00:02:19.881424 | orchestrator | + port_security_enabled = (known after apply)
2026-04-20 00:02:19.881428 | orchestrator | + qos_policy_id = (known after apply)
2026-04-20 00:02:19.881432 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.881439 | orchestrator | + security_group_ids = (known after apply)
2026-04-20 00:02:19.881443 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.881447 | orchestrator |
2026-04-20 00:02:19.881450 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.881454 | orchestrator | + ip_address = "192.168.16.254/32"
2026-04-20 00:02:19.881458 | orchestrator | }
2026-04-20 00:02:19.881462 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.881465 | orchestrator | + ip_address = "192.168.16.8/32"
2026-04-20 00:02:19.881469 | orchestrator | }
2026-04-20 00:02:19.881473 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.881476 | orchestrator | + ip_address = "192.168.16.9/32"
2026-04-20 00:02:19.881480 | orchestrator | }
2026-04-20 00:02:19.881484 | orchestrator |
2026-04-20 00:02:19.881487 | orchestrator | + binding (known after apply)
2026-04-20 00:02:19.881491 | orchestrator |
2026-04-20 00:02:19.881495 | orchestrator | + fixed_ip {
2026-04-20 00:02:19.881499 | orchestrator | + ip_address = "192.168.16.10"
2026-04-20 00:02:19.881502 | orchestrator | + subnet_id = (known after apply)
2026-04-20 00:02:19.881506 | orchestrator | }
2026-04-20 00:02:19.881510 | orchestrator | }
2026-04-20 00:02:19.881636 | orchestrator |
2026-04-20 00:02:19.881648 | orchestrator | # openstack_networking_port_v2.node_port_management[1] will be created
2026-04-20 00:02:19.881652 | orchestrator | + resource "openstack_networking_port_v2" "node_port_management" {
2026-04-20 00:02:19.881656 | orchestrator | + admin_state_up = (known after apply)
2026-04-20 00:02:19.881659 | orchestrator | + all_fixed_ips = (known after apply)
2026-04-20 00:02:19.881663 | orchestrator | + all_security_group_ids = (known after apply)
2026-04-20 00:02:19.881667 | orchestrator | + all_tags = (known after apply)
2026-04-20 00:02:19.881671 | orchestrator | + device_id = (known after apply)
2026-04-20 00:02:19.881674 | orchestrator | + device_owner = (known after apply)
2026-04-20 00:02:19.881678 | orchestrator | + dns_assignment = (known after apply)
2026-04-20 00:02:19.881682 | orchestrator | + dns_name = (known after apply)
2026-04-20 00:02:19.881685 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.881689 | orchestrator | + mac_address = (known after apply)
2026-04-20 00:02:19.881693 | orchestrator | + network_id = (known after apply)
2026-04-20 00:02:19.881697 | orchestrator | + port_security_enabled = (known after apply)
2026-04-20 00:02:19.881701 | orchestrator | + qos_policy_id = (known after apply)
2026-04-20 00:02:19.881704 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.881708 | orchestrator | + security_group_ids = (known after apply)
2026-04-20 00:02:19.881712 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.881715 | orchestrator |
2026-04-20 00:02:19.881719 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.881723 | orchestrator | + ip_address = "192.168.16.254/32"
2026-04-20 00:02:19.881727 | orchestrator | }
2026-04-20 00:02:19.881730 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.881734 | orchestrator | + ip_address = "192.168.16.8/32"
2026-04-20 00:02:19.881738 | orchestrator | }
2026-04-20 00:02:19.881741 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.881745 | orchestrator | + ip_address = "192.168.16.9/32"
2026-04-20 00:02:19.881749 | orchestrator | }
2026-04-20 00:02:19.881753 | orchestrator |
2026-04-20 00:02:19.881756 | orchestrator | + binding (known after apply)
2026-04-20 00:02:19.881760 | orchestrator |
2026-04-20 00:02:19.881764 | orchestrator | + fixed_ip {
2026-04-20 00:02:19.881767 | orchestrator | + ip_address = "192.168.16.11"
2026-04-20 00:02:19.881771 | orchestrator | + subnet_id = (known after apply)
2026-04-20 00:02:19.881775 | orchestrator | }
2026-04-20 00:02:19.881779 | orchestrator | }
2026-04-20 00:02:19.881922 | orchestrator |
2026-04-20 00:02:19.881935 | orchestrator | # openstack_networking_port_v2.node_port_management[2] will be created
2026-04-20 00:02:19.881939 | orchestrator | + resource "openstack_networking_port_v2" "node_port_management" {
2026-04-20 00:02:19.881943 | orchestrator | + admin_state_up = (known after apply)
2026-04-20 00:02:19.881947 | orchestrator | + all_fixed_ips = (known after apply)
2026-04-20 00:02:19.881950 | orchestrator | + all_security_group_ids = (known after apply)
2026-04-20 00:02:19.881954 | orchestrator | + all_tags = (known after apply)
2026-04-20 00:02:19.881962 | orchestrator | + device_id = (known after apply)
2026-04-20 00:02:19.881965 | orchestrator | + device_owner = (known after apply)
2026-04-20 00:02:19.881969 | orchestrator | + dns_assignment = (known after apply)
2026-04-20 00:02:19.881973 | orchestrator | + dns_name = (known after apply)
2026-04-20 00:02:19.881979 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.881983 | orchestrator | + mac_address = (known after apply)
2026-04-20 00:02:19.881987 | orchestrator | + network_id = (known after apply)
2026-04-20 00:02:19.881990 | orchestrator | + port_security_enabled = (known after apply)
2026-04-20 00:02:19.881994 | orchestrator | + qos_policy_id = (known after apply)
2026-04-20 00:02:19.881998 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.882001 | orchestrator | + security_group_ids = (known after apply)
2026-04-20 00:02:19.882005 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.882009 | orchestrator |
2026-04-20 00:02:19.882028 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.882032 | orchestrator | + ip_address = "192.168.16.254/32"
2026-04-20 00:02:19.882036 | orchestrator | }
2026-04-20 00:02:19.882040 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.882044 | orchestrator | + ip_address = "192.168.16.8/32"
2026-04-20 00:02:19.882048 | orchestrator | }
2026-04-20 00:02:19.882051 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.882055 | orchestrator | + ip_address = "192.168.16.9/32"
2026-04-20 00:02:19.882059 | orchestrator | }
2026-04-20 00:02:19.882062 | orchestrator |
2026-04-20 00:02:19.882066 | orchestrator | + binding (known after apply)
2026-04-20 00:02:19.882070 | orchestrator |
2026-04-20 00:02:19.882073 | orchestrator | + fixed_ip {
2026-04-20 00:02:19.882077 | orchestrator | + ip_address = "192.168.16.12"
2026-04-20 00:02:19.882081 | orchestrator | + subnet_id = (known after apply)
2026-04-20 00:02:19.882085 | orchestrator | }
2026-04-20 00:02:19.882088 | orchestrator | }
2026-04-20 00:02:19.882234 | orchestrator |
2026-04-20 00:02:19.882247 | orchestrator | # openstack_networking_port_v2.node_port_management[3] will be created
2026-04-20 00:02:19.882252 | orchestrator | + resource "openstack_networking_port_v2" "node_port_management" {
2026-04-20 00:02:19.882256 | orchestrator | + admin_state_up = (known after apply)
2026-04-20 00:02:19.882260 | orchestrator | + all_fixed_ips = (known after apply)
2026-04-20 00:02:19.882263 | orchestrator | + all_security_group_ids = (known after apply)
2026-04-20 00:02:19.882267 | orchestrator | + all_tags = (known after apply)
2026-04-20 00:02:19.882271 | orchestrator | + device_id = (known after apply)
2026-04-20 00:02:19.882275 | orchestrator | + device_owner = (known after apply)
2026-04-20 00:02:19.882278 | orchestrator | + dns_assignment = (known after apply)
2026-04-20 00:02:19.882282 | orchestrator | + dns_name = (known after apply)
2026-04-20 00:02:19.882286 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.882289 | orchestrator | + mac_address = (known after apply)
2026-04-20 00:02:19.882293 | orchestrator | + network_id = (known after apply)
2026-04-20 00:02:19.882297 | orchestrator | + port_security_enabled = (known after apply)
2026-04-20 00:02:19.882301 | orchestrator | + qos_policy_id = (known after apply)
2026-04-20 00:02:19.882304 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.882308 | orchestrator | + security_group_ids = (known after apply)
2026-04-20 00:02:19.882312 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.882316 | orchestrator |
2026-04-20 00:02:19.882319 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.882323 | orchestrator | + ip_address = "192.168.16.254/32"
2026-04-20 00:02:19.882327 | orchestrator | }
2026-04-20 00:02:19.882331 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.882334 | orchestrator | + ip_address = "192.168.16.8/32"
2026-04-20 00:02:19.882338 | orchestrator | }
2026-04-20 00:02:19.882342 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.882346 | orchestrator | + ip_address = "192.168.16.9/32"
2026-04-20 00:02:19.882349 | orchestrator | }
2026-04-20 00:02:19.882353 | orchestrator |
2026-04-20 00:02:19.882361 | orchestrator | + binding (known after apply)
2026-04-20 00:02:19.882365 | orchestrator |
2026-04-20 00:02:19.882368 | orchestrator | + fixed_ip {
2026-04-20 00:02:19.882372 | orchestrator | + ip_address = "192.168.16.13"
2026-04-20 00:02:19.882376 | orchestrator | + subnet_id = (known after apply)
2026-04-20 00:02:19.882380 | orchestrator | }
2026-04-20 00:02:19.882383 | orchestrator | }
2026-04-20 00:02:19.882519 | orchestrator |
2026-04-20 00:02:19.882531 | orchestrator | # openstack_networking_port_v2.node_port_management[4] will be created
2026-04-20 00:02:19.882536 | orchestrator | + resource "openstack_networking_port_v2" "node_port_management" {
2026-04-20 00:02:19.882540 | orchestrator | + admin_state_up = (known after apply)
2026-04-20 00:02:19.882543 | orchestrator | + all_fixed_ips = (known after apply)
2026-04-20 00:02:19.882547 | orchestrator | + all_security_group_ids = (known after apply)
2026-04-20 00:02:19.882551 | orchestrator | + all_tags = (known after apply)
2026-04-20 00:02:19.882555 | orchestrator | + device_id = (known after apply)
2026-04-20 00:02:19.882558 | orchestrator | + device_owner = (known after apply)
2026-04-20 00:02:19.882562 | orchestrator | + dns_assignment = (known after apply)
2026-04-20 00:02:19.882566 | orchestrator | + dns_name = (known after apply)
2026-04-20 00:02:19.882569 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.882573 | orchestrator | + mac_address = (known after apply)
2026-04-20 00:02:19.882577 | orchestrator | + network_id = (known after apply)
2026-04-20 00:02:19.882581 | orchestrator | + port_security_enabled = (known after apply)
2026-04-20 00:02:19.882584 | orchestrator | + qos_policy_id = (known after apply)
2026-04-20 00:02:19.882588 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.882592 | orchestrator | + security_group_ids = (known after apply)
2026-04-20 00:02:19.882596 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.882600 | orchestrator |
2026-04-20 00:02:19.882604 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.882608 | orchestrator | + ip_address = "192.168.16.254/32"
2026-04-20 00:02:19.882612 | orchestrator | }
2026-04-20 00:02:19.882615 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.882619 | orchestrator | + ip_address = "192.168.16.8/32"
2026-04-20 00:02:19.882623 | orchestrator | }
2026-04-20 00:02:19.882626 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.882630 | orchestrator | + ip_address = "192.168.16.9/32"
2026-04-20 00:02:19.882634 | orchestrator | }
2026-04-20 00:02:19.882638 | orchestrator |
2026-04-20 00:02:19.882641 | orchestrator | + binding (known after apply)
2026-04-20 00:02:19.882645 | orchestrator |
2026-04-20 00:02:19.882649 | orchestrator | + fixed_ip {
2026-04-20 00:02:19.882652 | orchestrator | + ip_address = "192.168.16.14"
2026-04-20 00:02:19.882656 | orchestrator | + subnet_id = (known after apply)
2026-04-20 00:02:19.882660 | orchestrator | }
2026-04-20 00:02:19.882664 | orchestrator | }
2026-04-20 00:02:19.882790 | orchestrator |
2026-04-20 00:02:19.882801 | orchestrator | # openstack_networking_port_v2.node_port_management[5] will be created
2026-04-20 00:02:19.882806 | orchestrator | + resource "openstack_networking_port_v2" "node_port_management" {
2026-04-20 00:02:19.882810 | orchestrator | + admin_state_up = (known after apply)
2026-04-20 00:02:19.882813 | orchestrator | + all_fixed_ips = (known after apply)
2026-04-20 00:02:19.882817 | orchestrator | + all_security_group_ids = (known after apply)
2026-04-20 00:02:19.882821 | orchestrator | + all_tags = (known after apply)
2026-04-20 00:02:19.882825 | orchestrator | + device_id = (known after apply)
2026-04-20 00:02:19.882828 | orchestrator | + device_owner = (known after apply)
2026-04-20 00:02:19.882832 | orchestrator | + dns_assignment = (known after apply)
2026-04-20 00:02:19.882836 | orchestrator | + dns_name = (known after apply)
2026-04-20 00:02:19.882840 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.882843 | orchestrator | + mac_address = (known after apply)
2026-04-20 00:02:19.882847 | orchestrator | + network_id = (known after apply)
2026-04-20 00:02:19.882851 | orchestrator | + port_security_enabled = (known after apply)
2026-04-20 00:02:19.882855 | orchestrator | + qos_policy_id = (known after apply)
2026-04-20 00:02:19.882866 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.882870 | orchestrator | + security_group_ids = (known after apply)
2026-04-20 00:02:19.882873 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.882877 | orchestrator |
2026-04-20 00:02:19.882881 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.882885 | orchestrator | + ip_address = "192.168.16.254/32"
2026-04-20 00:02:19.882888 | orchestrator | }
2026-04-20 00:02:19.882892 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.882896 | orchestrator | + ip_address = "192.168.16.8/32"
2026-04-20 00:02:19.882899 | orchestrator | }
2026-04-20 00:02:19.882903 | orchestrator | + allowed_address_pairs {
2026-04-20 00:02:19.882907 | orchestrator | + ip_address = "192.168.16.9/32"
2026-04-20 00:02:19.882911 | orchestrator | }
2026-04-20 00:02:19.882914 | orchestrator |
2026-04-20 00:02:19.882921 | orchestrator | + binding (known after apply)
2026-04-20 00:02:19.882925 | orchestrator |
2026-04-20 00:02:19.882929 | orchestrator | + fixed_ip {
2026-04-20 00:02:19.882932 | orchestrator | + ip_address = "192.168.16.15"
2026-04-20 00:02:19.882936 | orchestrator | + subnet_id = (known after apply)
2026-04-20 00:02:19.882940 | orchestrator | }
2026-04-20 00:02:19.882943 | orchestrator | }
2026-04-20 00:02:19.882987 | orchestrator |
2026-04-20 00:02:19.882997 | orchestrator | # openstack_networking_router_interface_v2.router_interface will be created
2026-04-20 00:02:19.883002 | orchestrator | + resource "openstack_networking_router_interface_v2" "router_interface" {
2026-04-20 00:02:19.883006 | orchestrator | + force_destroy = false
2026-04-20 00:02:19.883010 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.883013 | orchestrator | + port_id = (known after apply)
2026-04-20 00:02:19.883017 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.883021 | orchestrator | + router_id = (known after apply)
2026-04-20 00:02:19.883024 | orchestrator | + subnet_id = (known after apply)
2026-04-20 00:02:19.883028 | orchestrator | }
2026-04-20 00:02:19.883106 | orchestrator |
2026-04-20 00:02:19.883117 | orchestrator | # openstack_networking_router_v2.router will be created
2026-04-20 00:02:19.883121 | orchestrator | + resource "openstack_networking_router_v2" "router" {
2026-04-20 00:02:19.883125 | orchestrator | + admin_state_up = (known after apply)
2026-04-20 00:02:19.883129 | orchestrator | + all_tags = (known after apply)
2026-04-20 00:02:19.883133 | orchestrator | + availability_zone_hints = [
2026-04-20 00:02:19.883137 | orchestrator | + "nova",
2026-04-20 00:02:19.883140 | orchestrator | ]
2026-04-20 00:02:19.883144 | orchestrator | + distributed = (known after apply)
2026-04-20 00:02:19.883148 | orchestrator | + enable_snat = (known after apply)
2026-04-20 00:02:19.883152 | orchestrator | + external_network_id = "e6be7364-bfd8-4de7-8120-8f41c69a139a"
2026-04-20 00:02:19.883155 | orchestrator | + external_qos_policy_id = (known after apply)
2026-04-20 00:02:19.883159 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.883163 | orchestrator | + name = "testbed"
2026-04-20 00:02:19.883167 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.883170 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.883174 | orchestrator |
2026-04-20 00:02:19.883178 | orchestrator | + external_fixed_ip (known after apply)
2026-04-20 00:02:19.883182 | orchestrator | }
2026-04-20 00:02:19.883273 | orchestrator |
2026-04-20 00:02:19.883285 | orchestrator | # openstack_networking_secgroup_rule_v2.security_group_management_rule1 will be created
2026-04-20 00:02:19.883291 | orchestrator | + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule1" {
2026-04-20 00:02:19.883294 | orchestrator | + description = "ssh"
2026-04-20 00:02:19.883298 | orchestrator | + direction = "ingress"
2026-04-20 00:02:19.883302 | orchestrator | + ethertype = "IPv4"
2026-04-20 00:02:19.883306 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.883310 | orchestrator | + port_range_max = 22
2026-04-20 00:02:19.883313 | orchestrator | + port_range_min = 22
2026-04-20 00:02:19.883317 | orchestrator | + protocol = "tcp"
2026-04-20 00:02:19.883321 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.883329 | orchestrator | + remote_address_group_id = (known after apply)
2026-04-20 00:02:19.883332 | orchestrator | + remote_group_id = (known after apply)
2026-04-20 00:02:19.883336 | orchestrator | + remote_ip_prefix = "0.0.0.0/0"
2026-04-20 00:02:19.883340 | orchestrator | + security_group_id = (known after apply)
2026-04-20 00:02:19.883344 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.883347 | orchestrator | }
2026-04-20 00:02:19.883427 | orchestrator |
2026-04-20 00:02:19.883438 | orchestrator | # openstack_networking_secgroup_rule_v2.security_group_management_rule2 will be created
2026-04-20 00:02:19.883443 | orchestrator | + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule2" {
2026-04-20 00:02:19.883446 | orchestrator | + description = "wireguard"
2026-04-20 00:02:19.883450 | orchestrator | + direction = "ingress"
2026-04-20 00:02:19.883454 | orchestrator | + ethertype = "IPv4"
2026-04-20 00:02:19.883458 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.883461 | orchestrator | + port_range_max = 51820
2026-04-20 00:02:19.883465 | orchestrator | + port_range_min = 51820
2026-04-20 00:02:19.883469 | orchestrator | + protocol = "udp"
2026-04-20 00:02:19.883472 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.883476 | orchestrator | + remote_address_group_id = (known after apply)
2026-04-20 00:02:19.883480 | orchestrator | + remote_group_id = (known after apply)
2026-04-20 00:02:19.883483 | orchestrator | + remote_ip_prefix = "0.0.0.0/0"
2026-04-20 00:02:19.883487 | orchestrator | + security_group_id = (known after apply)
2026-04-20 00:02:19.883491 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.883495 | orchestrator | }
2026-04-20 00:02:19.883555 | orchestrator |
2026-04-20 00:02:19.883566 | orchestrator | # openstack_networking_secgroup_rule_v2.security_group_management_rule3 will be created
2026-04-20 00:02:19.883571 | orchestrator | + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule3" {
2026-04-20 00:02:19.883574 | orchestrator | + direction = "ingress"
2026-04-20 00:02:19.883578 | orchestrator | + ethertype = "IPv4"
2026-04-20 00:02:19.883582 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.883586 | orchestrator | + protocol = "tcp"
2026-04-20 00:02:19.883589 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.883593 | orchestrator | + remote_address_group_id = (known after apply)
2026-04-20 00:02:19.883597 | orchestrator | + remote_group_id = (known after apply)
2026-04-20 00:02:19.883601 | orchestrator | + remote_ip_prefix = "192.168.16.0/20"
2026-04-20 00:02:19.883604 | orchestrator | + security_group_id = (known after apply)
2026-04-20 00:02:19.883608 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.883612 | orchestrator | }
2026-04-20 00:02:19.883669 | orchestrator |
2026-04-20 00:02:19.883680 | orchestrator | # openstack_networking_secgroup_rule_v2.security_group_management_rule4 will be created
2026-04-20 00:02:19.883684 | orchestrator | + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule4" {
2026-04-20 00:02:19.883688 | orchestrator | + direction = "ingress"
2026-04-20 00:02:19.883692 | orchestrator | + ethertype = "IPv4"
2026-04-20 00:02:19.883695 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.883699 | orchestrator | + protocol = "udp"
2026-04-20 00:02:19.883703 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.883707 | orchestrator | + remote_address_group_id = (known after apply)
2026-04-20 00:02:19.883710 | orchestrator | + remote_group_id = (known after apply)
2026-04-20 00:02:19.883714 | orchestrator | + remote_ip_prefix = "192.168.16.0/20"
2026-04-20 00:02:19.883718 | orchestrator | + security_group_id = (known after apply)
2026-04-20 00:02:19.883722 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.883725 | orchestrator | }
2026-04-20 00:02:19.883782 | orchestrator |
2026-04-20 00:02:19.883793 | orchestrator | # openstack_networking_secgroup_rule_v2.security_group_management_rule5 will be created
2026-04-20 00:02:19.883867 | orchestrator | + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule5" {
2026-04-20 00:02:19.883871 | orchestrator | + direction = "ingress"
2026-04-20 00:02:19.883875 | orchestrator | + ethertype = "IPv4"
2026-04-20 00:02:19.883879 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.883883 | orchestrator | + protocol = "icmp"
2026-04-20 00:02:19.883887 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.883890 | orchestrator | + remote_address_group_id = (known after apply)
2026-04-20 00:02:19.883894 | orchestrator | + remote_group_id = (known after apply)
2026-04-20 00:02:19.883898 | orchestrator | + remote_ip_prefix = "0.0.0.0/0"
2026-04-20 00:02:19.883902 | orchestrator | + security_group_id = (known after apply)
2026-04-20 00:02:19.883905 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.883909 | orchestrator | }
2026-04-20 00:02:19.883972 | orchestrator |
2026-04-20 00:02:19.883984 | orchestrator | # openstack_networking_secgroup_rule_v2.security_group_node_rule1 will be created
2026-04-20 00:02:19.883989 | orchestrator | + resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule1" {
2026-04-20 00:02:19.883992 | orchestrator | + direction = "ingress"
2026-04-20 00:02:19.883996 | orchestrator | + ethertype = "IPv4"
2026-04-20 00:02:19.884000 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.884004 | orchestrator | + protocol = "tcp"
2026-04-20 00:02:19.884008 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.884012 | orchestrator | + remote_address_group_id = (known after apply)
2026-04-20 00:02:19.884018 | orchestrator | + remote_group_id = (known after apply)
2026-04-20 00:02:19.884022 | orchestrator | + remote_ip_prefix = "0.0.0.0/0"
2026-04-20 00:02:19.884025 | orchestrator | + security_group_id = (known after apply)
2026-04-20 00:02:19.884029 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.884033 | orchestrator | }
2026-04-20 00:02:19.884090 | orchestrator |
2026-04-20 00:02:19.884101 | orchestrator | # openstack_networking_secgroup_rule_v2.security_group_node_rule2 will be created
2026-04-20 00:02:19.884105 | orchestrator | + resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule2" {
2026-04-20 00:02:19.884109 | orchestrator | + direction = "ingress"
2026-04-20 00:02:19.884113 | orchestrator | + ethertype = "IPv4"
2026-04-20 00:02:19.884116 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.884120 | orchestrator | + protocol = "udp"
2026-04-20 00:02:19.884124 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.884128 | orchestrator | + remote_address_group_id = (known after apply)
2026-04-20 00:02:19.884132 | orchestrator | + remote_group_id = (known after apply)
2026-04-20 00:02:19.884135 | orchestrator | + remote_ip_prefix = "0.0.0.0/0"
2026-04-20 00:02:19.884139 | orchestrator | + security_group_id = (known after apply)
2026-04-20 00:02:19.884143 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.884147 | orchestrator | }
2026-04-20 00:02:19.884205 | orchestrator |
2026-04-20 00:02:19.884248 | orchestrator | # openstack_networking_secgroup_rule_v2.security_group_node_rule3 will be created
2026-04-20 00:02:19.884253 | orchestrator | + resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule3" {
2026-04-20 00:02:19.884256 | orchestrator | + direction = "ingress"
2026-04-20 00:02:19.884263 | orchestrator | + ethertype = "IPv4"
2026-04-20 00:02:19.884267 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.884271 | orchestrator | + protocol = "icmp"
2026-04-20 00:02:19.884275 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.884278 | orchestrator | + remote_address_group_id = (known after apply)
2026-04-20 00:02:19.884282 | orchestrator | + remote_group_id = (known after apply)
2026-04-20 00:02:19.884286 | orchestrator | + remote_ip_prefix = "0.0.0.0/0"
2026-04-20 00:02:19.884289 | orchestrator | + security_group_id = (known after apply)
2026-04-20 00:02:19.884293 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.884302 | orchestrator | }
2026-04-20 00:02:19.884365 | orchestrator |
2026-04-20 00:02:19.884376 | orchestrator | # openstack_networking_secgroup_rule_v2.security_group_rule_vrrp will be created
2026-04-20 00:02:19.884381 | orchestrator | + resource "openstack_networking_secgroup_rule_v2" "security_group_rule_vrrp" {
2026-04-20 00:02:19.884385 | orchestrator | + description = "vrrp"
2026-04-20 00:02:19.884388 | orchestrator | + direction = "ingress"
2026-04-20 00:02:19.884392 | orchestrator | + ethertype = "IPv4"
2026-04-20 00:02:19.884396 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.884399 | orchestrator | + protocol = "112"
2026-04-20 00:02:19.884403 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.884407 | orchestrator | + remote_address_group_id = (known after apply)
2026-04-20 00:02:19.884410 | orchestrator | + remote_group_id = (known after apply)
2026-04-20 00:02:19.884414 | orchestrator | + remote_ip_prefix = "0.0.0.0/0"
2026-04-20 00:02:19.884418 | orchestrator | + security_group_id = (known after apply)
2026-04-20 00:02:19.884422 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.884425 | orchestrator | }
2026-04-20 00:02:19.884469 | orchestrator |
2026-04-20 00:02:19.884480 | orchestrator | # openstack_networking_secgroup_v2.security_group_management will be created
2026-04-20 00:02:19.884484 | orchestrator | + resource "openstack_networking_secgroup_v2" "security_group_management" {
2026-04-20 00:02:19.884488 | orchestrator | + all_tags = (known after apply)
2026-04-20 00:02:19.884492 | orchestrator | + description = "management security group"
2026-04-20 00:02:19.884496 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.884499 | orchestrator | + name = "testbed-management"
2026-04-20 00:02:19.884503 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.884507 | orchestrator | + stateful = (known after apply)
2026-04-20 00:02:19.884511 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.884514 | orchestrator | }
2026-04-20 00:02:19.884559 | orchestrator |
2026-04-20 00:02:19.884570 | orchestrator | # openstack_networking_secgroup_v2.security_group_node will be created
2026-04-20 00:02:19.884575 | orchestrator | + resource "openstack_networking_secgroup_v2" "security_group_node" {
2026-04-20 00:02:19.884578 | orchestrator | + all_tags = (known after apply)
2026-04-20 00:02:19.884582 | orchestrator | + description = "node security group"
2026-04-20 00:02:19.884586 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.884589 | orchestrator | + name = "testbed-node"
2026-04-20 00:02:19.884593 | orchestrator | + region = (known after apply)
2026-04-20 00:02:19.884597 | orchestrator | + stateful = (known after apply)
2026-04-20 00:02:19.884600 | orchestrator | + tenant_id = (known after apply)
2026-04-20 00:02:19.884604 | orchestrator | }
2026-04-20 00:02:19.884706 | orchestrator |
2026-04-20 00:02:19.884718 | orchestrator | # openstack_networking_subnet_v2.subnet_management will be created
2026-04-20 00:02:19.884722 | orchestrator | + resource "openstack_networking_subnet_v2" "subnet_management" {
2026-04-20 00:02:19.884726 | orchestrator | + all_tags = (known after apply)
2026-04-20 00:02:19.884730 | orchestrator | + cidr = "192.168.16.0/20"
2026-04-20 00:02:19.884733 | orchestrator | + dns_nameservers = [
2026-04-20 00:02:19.884737 | orchestrator | + "8.8.8.8",
2026-04-20 00:02:19.884741 | orchestrator | + "9.9.9.9",
2026-04-20 00:02:19.884745 | orchestrator | ]
2026-04-20 00:02:19.884748 | orchestrator | + enable_dhcp = true
2026-04-20 00:02:19.884752 | orchestrator | + gateway_ip = (known after apply)
2026-04-20 00:02:19.884756 | orchestrator | + id = (known after apply)
2026-04-20 00:02:19.884760 | orchestrator | + ip_version = 4
2026-04-20 00:02:19.884763 | orchestrator | + ipv6_address_mode = (known after apply)
2026-04-20 00:02:19.884767 | orchestrator | + ipv6_ra_mode = (known after apply)
2026-04-20 00:02:19.884771 | orchestrator | + name = "subnet-testbed-management"
2026-04-20 00:02:19.884774 | orchestrator | + network_id = (known after apply) 2026-04-20 00:02:19.884778 | orchestrator | + no_gateway = false 2026-04-20 00:02:19.884782 | orchestrator | + region = (known after apply) 2026-04-20 00:02:19.884785 | orchestrator | + service_types = (known after apply) 2026-04-20 00:02:19.884793 | orchestrator | + tenant_id = (known after apply) 2026-04-20 00:02:19.884797 | orchestrator | 2026-04-20 00:02:19.884801 | orchestrator | + allocation_pool { 2026-04-20 00:02:19.884804 | orchestrator | + end = "192.168.31.250" 2026-04-20 00:02:19.884808 | orchestrator | + start = "192.168.31.200" 2026-04-20 00:02:19.884812 | orchestrator | } 2026-04-20 00:02:19.884815 | orchestrator | } 2026-04-20 00:02:19.884846 | orchestrator | 2026-04-20 00:02:19.884857 | orchestrator | # terraform_data.image will be created 2026-04-20 00:02:19.884861 | orchestrator | + resource "terraform_data" "image" { 2026-04-20 00:02:19.884865 | orchestrator | + id = (known after apply) 2026-04-20 00:02:19.884869 | orchestrator | + input = "Ubuntu 24.04" 2026-04-20 00:02:19.884873 | orchestrator | + output = (known after apply) 2026-04-20 00:02:19.884876 | orchestrator | } 2026-04-20 00:02:19.884906 | orchestrator | 2026-04-20 00:02:19.884916 | orchestrator | # terraform_data.image_node will be created 2026-04-20 00:02:19.884920 | orchestrator | + resource "terraform_data" "image_node" { 2026-04-20 00:02:19.884924 | orchestrator | + id = (known after apply) 2026-04-20 00:02:19.884928 | orchestrator | + input = "Ubuntu 24.04" 2026-04-20 00:02:19.884932 | orchestrator | + output = (known after apply) 2026-04-20 00:02:19.884935 | orchestrator | } 2026-04-20 00:02:19.884950 | orchestrator | 2026-04-20 00:02:19.884954 | orchestrator | Plan: 64 to add, 0 to change, 0 to destroy. 
2026-04-20 00:02:19.884965 | orchestrator | 2026-04-20 00:02:19.884969 | orchestrator | Changes to Outputs: 2026-04-20 00:02:19.884979 | orchestrator | + manager_address = (sensitive value) 2026-04-20 00:02:19.884983 | orchestrator | + private_key = (sensitive value) 2026-04-20 00:02:20.027819 | orchestrator | terraform_data.image_node: Creating... 2026-04-20 00:02:20.028016 | orchestrator | terraform_data.image_node: Creation complete after 0s [id=72b46bc5-0abc-3810-a934-7edfece69deb] 2026-04-20 00:02:20.079842 | orchestrator | terraform_data.image: Creating... 2026-04-20 00:02:20.080149 | orchestrator | terraform_data.image: Creation complete after 0s [id=d33111b8-b448-c445-1072-59a5a333a024] 2026-04-20 00:02:20.105407 | orchestrator | data.openstack_images_image_v2.image: Reading... 2026-04-20 00:02:20.110001 | orchestrator | data.openstack_images_image_v2.image_node: Reading... 2026-04-20 00:02:20.114390 | orchestrator | openstack_compute_keypair_v2.key: Creating... 2026-04-20 00:02:20.114629 | orchestrator | openstack_networking_network_v2.net_management: Creating... 2026-04-20 00:02:20.116813 | orchestrator | openstack_blockstorage_volume_v3.node_volume[2]: Creating... 2026-04-20 00:02:20.118639 | orchestrator | openstack_blockstorage_volume_v3.node_volume[7]: Creating... 2026-04-20 00:02:20.121527 | orchestrator | openstack_blockstorage_volume_v3.node_volume[3]: Creating... 2026-04-20 00:02:20.121750 | orchestrator | openstack_blockstorage_volume_v3.node_volume[8]: Creating... 2026-04-20 00:02:20.127489 | orchestrator | openstack_blockstorage_volume_v3.node_volume[5]: Creating... 2026-04-20 00:02:20.127616 | orchestrator | openstack_blockstorage_volume_v3.node_volume[0]: Creating... 2026-04-20 00:02:20.584446 | orchestrator | data.openstack_images_image_v2.image_node: Read complete after 1s [id=846820b2-039e-4b42-adad-daf72e0f8ea4] 2026-04-20 00:02:20.593438 | orchestrator | openstack_blockstorage_volume_v3.node_volume[6]: Creating... 
2026-04-20 00:02:20.675264 | orchestrator | openstack_compute_keypair_v2.key: Creation complete after 1s [id=testbed] 2026-04-20 00:02:20.679694 | orchestrator | openstack_blockstorage_volume_v3.node_volume[1]: Creating... 2026-04-20 00:02:20.894190 | orchestrator | data.openstack_images_image_v2.image: Read complete after 1s [id=846820b2-039e-4b42-adad-daf72e0f8ea4] 2026-04-20 00:02:20.902265 | orchestrator | openstack_blockstorage_volume_v3.node_volume[4]: Creating... 2026-04-20 00:02:21.191190 | orchestrator | openstack_networking_network_v2.net_management: Creation complete after 1s [id=3564d677-8def-4be6-8b8e-9ee2e3d8076c] 2026-04-20 00:02:21.196112 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[4]: Creating... 2026-04-20 00:02:23.767521 | orchestrator | openstack_blockstorage_volume_v3.node_volume[5]: Creation complete after 4s [id=bdcbd50e-fc40-4173-bc88-351fd741a560] 2026-04-20 00:02:23.782838 | orchestrator | local_file.id_rsa_pub: Creating... 2026-04-20 00:02:23.788489 | orchestrator | local_file.id_rsa_pub: Creation complete after 0s [id=90530c2d0db835bccb4cc5c67b7ba32755375979] 2026-04-20 00:02:23.799436 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[0]: Creating... 2026-04-20 00:02:23.806858 | orchestrator | openstack_blockstorage_volume_v3.node_volume[7]: Creation complete after 4s [id=6f84c887-ba73-482f-a41f-d5b1a59c2e3c] 2026-04-20 00:02:23.814924 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[1]: Creating... 2026-04-20 00:02:23.822648 | orchestrator | openstack_blockstorage_volume_v3.node_volume[2]: Creation complete after 4s [id=6895d0f2-ba69-41e1-a4cc-d0f527389fe4] 2026-04-20 00:02:23.826418 | orchestrator | openstack_blockstorage_volume_v3.node_volume[8]: Creation complete after 4s [id=bb585aa1-11e8-43ef-a761-9431875b84d1] 2026-04-20 00:02:23.830442 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[5]: Creating... 
2026-04-20 00:02:23.833578 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[3]: Creating... 2026-04-20 00:02:23.843093 | orchestrator | openstack_blockstorage_volume_v3.node_volume[3]: Creation complete after 4s [id=4d9b431e-9b52-486b-bddb-3e9e0ee5fa39] 2026-04-20 00:02:23.846821 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[2]: Creating... 2026-04-20 00:02:23.876716 | orchestrator | openstack_blockstorage_volume_v3.node_volume[6]: Creation complete after 3s [id=0c844390-ddcc-47db-87c2-e0ad3f299f11] 2026-04-20 00:02:23.878376 | orchestrator | openstack_blockstorage_volume_v3.node_volume[0]: Creation complete after 4s [id=71e5e2fe-8079-44a9-83c9-718c1a37ec11] 2026-04-20 00:02:23.885968 | orchestrator | local_sensitive_file.id_rsa: Creating... 2026-04-20 00:02:23.888797 | orchestrator | openstack_blockstorage_volume_v3.manager_base_volume[0]: Creating... 2026-04-20 00:02:23.893546 | orchestrator | local_sensitive_file.id_rsa: Creation complete after 0s [id=2b121fae02c720fe196b6860f0f001e02a16a89a] 2026-04-20 00:02:23.898337 | orchestrator | openstack_networking_subnet_v2.subnet_management: Creating... 2026-04-20 00:02:24.042178 | orchestrator | openstack_blockstorage_volume_v3.node_volume[1]: Creation complete after 3s [id=0604a395-fc8c-4060-a9f6-9fb568501435] 2026-04-20 00:02:24.121063 | orchestrator | openstack_blockstorage_volume_v3.node_volume[4]: Creation complete after 3s [id=9b7f1cab-7403-4991-80fd-9e18e6faf85e] 2026-04-20 00:02:24.559250 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[4]: Creation complete after 4s [id=8a57eed1-8d6f-4860-8edf-ab5651bf3501] 2026-04-20 00:02:24.870574 | orchestrator | openstack_networking_subnet_v2.subnet_management: Creation complete after 1s [id=6f4cf651-1578-45e3-97c7-2a2a54ea416e] 2026-04-20 00:02:24.880466 | orchestrator | openstack_networking_router_v2.router: Creating... 
2026-04-20 00:02:27.204560 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[0]: Creation complete after 3s [id=0ddb617a-526c-421f-a511-7cc1055ebfef] 2026-04-20 00:02:27.269253 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[1]: Creation complete after 3s [id=94a87711-1bba-4ac5-aa91-62925126bc5a] 2026-04-20 00:02:27.298410 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[5]: Creation complete after 3s [id=febc5b66-3851-48e5-b18a-64e71ac34203] 2026-04-20 00:02:27.340048 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[3]: Creation complete after 3s [id=8a9991d5-8e83-4951-b0c2-d6541434356e] 2026-04-20 00:02:27.361793 | orchestrator | openstack_blockstorage_volume_v3.manager_base_volume[0]: Creation complete after 3s [id=460d9951-2554-46b6-90f0-f6ed415ae217] 2026-04-20 00:02:27.378847 | orchestrator | openstack_blockstorage_volume_v3.node_base_volume[2]: Creation complete after 3s [id=abc91533-fafb-4291-911d-be538a80553e] 2026-04-20 00:02:28.846169 | orchestrator | openstack_networking_router_v2.router: Creation complete after 4s [id=a5268d52-aec6-4ae8-89dc-5840e0789fac] 2026-04-20 00:02:28.854825 | orchestrator | openstack_networking_secgroup_v2.security_group_node: Creating... 2026-04-20 00:02:28.855372 | orchestrator | openstack_networking_secgroup_v2.security_group_management: Creating... 2026-04-20 00:02:28.858685 | orchestrator | openstack_networking_router_interface_v2.router_interface: Creating... 2026-04-20 00:02:29.085518 | orchestrator | openstack_networking_secgroup_v2.security_group_management: Creation complete after 0s [id=d0b7dc5a-002f-4fd5-813d-b04be0125331] 2026-04-20 00:02:29.491793 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule4: Creating... 2026-04-20 00:02:29.491908 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule3: Creating... 
2026-04-20 00:02:29.491926 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule5: Creating... 2026-04-20 00:02:29.491966 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule2: Creating... 2026-04-20 00:02:29.491978 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule1: Creating... 2026-04-20 00:02:29.491990 | orchestrator | openstack_networking_port_v2.manager_port_management: Creating... 2026-04-20 00:02:29.492001 | orchestrator | openstack_networking_secgroup_v2.security_group_node: Creation complete after 0s [id=25a86a3a-7bdc-4f34-ae7e-8fc750e22ffc] 2026-04-20 00:02:29.492070 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_rule_vrrp: Creating... 2026-04-20 00:02:29.492084 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_node_rule1: Creating... 2026-04-20 00:02:29.492095 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_node_rule2: Creating... 2026-04-20 00:02:29.492145 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule4: Creation complete after 0s [id=cf8a0a41-78ce-4f58-83c9-8cdcd7669a00] 2026-04-20 00:02:29.492160 | orchestrator | openstack_networking_port_v2.node_port_management[3]: Creating... 2026-04-20 00:02:29.627316 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_rule_vrrp: Creation complete after 1s [id=29b6dc14-046d-466a-9486-b5afa44099d5] 2026-04-20 00:02:29.640372 | orchestrator | openstack_networking_port_v2.node_port_management[5]: Creating... 2026-04-20 00:02:29.914498 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule3: Creation complete after 1s [id=e18d1f1e-fb35-418d-951a-5bdc121d9ebe] 2026-04-20 00:02:29.924446 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_node_rule3: Creating... 
2026-04-20 00:02:29.996690 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_node_rule2: Creation complete after 1s [id=83ce4efc-0ccb-4f4f-a65e-e33f5e25ecc4] 2026-04-20 00:02:30.008433 | orchestrator | openstack_networking_port_v2.node_port_management[1]: Creating... 2026-04-20 00:02:30.325315 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule5: Creation complete after 1s [id=31ded242-2c19-46cb-87a8-2b6a22d28b87] 2026-04-20 00:02:30.330618 | orchestrator | openstack_networking_port_v2.node_port_management[4]: Creating... 2026-04-20 00:02:30.336339 | orchestrator | openstack_networking_port_v2.manager_port_management: Creation complete after 1s [id=f7434a63-67fb-4940-b5f7-5a049f724b58] 2026-04-20 00:02:30.346477 | orchestrator | openstack_networking_port_v2.node_port_management[2]: Creating... 2026-04-20 00:02:30.366626 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_node_rule1: Creation complete after 1s [id=8e779b17-901e-4c74-b8e4-06bd0fcea49d] 2026-04-20 00:02:30.374075 | orchestrator | openstack_networking_port_v2.node_port_management[0]: Creating... 
2026-04-20 00:02:30.775007 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule1: Creation complete after 2s [id=ae73e455-2c44-4d27-bbea-7f6e07740a51] 2026-04-20 00:02:30.844592 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_node_rule3: Creation complete after 1s [id=128f69ff-33cb-4951-8d95-724c3ac6ff2b] 2026-04-20 00:02:31.108755 | orchestrator | openstack_networking_port_v2.node_port_management[1]: Creation complete after 1s [id=e7df36ab-7f7c-40e1-853b-d0174b1437d8] 2026-04-20 00:02:31.346872 | orchestrator | openstack_networking_secgroup_rule_v2.security_group_management_rule2: Creation complete after 2s [id=e0d39450-f4c1-438b-a5bb-2519b7484cd7] 2026-04-20 00:02:31.403631 | orchestrator | openstack_networking_port_v2.node_port_management[5]: Creation complete after 1s [id=c0d7958f-2972-4e3f-8f45-e999644762f9] 2026-04-20 00:02:31.443023 | orchestrator | openstack_networking_port_v2.node_port_management[2]: Creation complete after 1s [id=1c06c1ee-93ce-4ff1-82ee-3291ad413150] 2026-04-20 00:02:31.766751 | orchestrator | openstack_networking_port_v2.node_port_management[0]: Creation complete after 2s [id=daf2c5d3-02cd-409f-9ff4-c6c9c274c883] 2026-04-20 00:02:31.873911 | orchestrator | openstack_networking_port_v2.node_port_management[3]: Creation complete after 3s [id=81ada3bd-3822-4298-9fc6-0d2d7af84663] 2026-04-20 00:02:32.160926 | orchestrator | openstack_networking_port_v2.node_port_management[4]: Creation complete after 2s [id=b0707f86-5249-430c-90d5-219ddad6e3af] 2026-04-20 00:02:34.405907 | orchestrator | openstack_networking_router_interface_v2.router_interface: Creation complete after 5s [id=5d4264d2-d00b-4f0a-8bd0-652de8c9a3a1] 2026-04-20 00:02:34.418644 | orchestrator | openstack_networking_floatingip_v2.manager_floating_ip: Creating... 2026-04-20 00:02:34.441927 | orchestrator | openstack_compute_instance_v2.node_server[5]: Creating... 
2026-04-20 00:02:34.442642 | orchestrator | openstack_compute_instance_v2.node_server[3]: Creating... 2026-04-20 00:02:34.444696 | orchestrator | openstack_compute_instance_v2.node_server[1]: Creating... 2026-04-20 00:02:34.459329 | orchestrator | openstack_compute_instance_v2.node_server[4]: Creating... 2026-04-20 00:02:34.459878 | orchestrator | openstack_compute_instance_v2.node_server[0]: Creating... 2026-04-20 00:02:34.461557 | orchestrator | openstack_compute_instance_v2.node_server[2]: Creating... 2026-04-20 00:02:36.224344 | orchestrator | openstack_networking_floatingip_v2.manager_floating_ip: Creation complete after 2s [id=13008940-7e25-4562-919c-dd11ea7d0cc7] 2026-04-20 00:02:36.232963 | orchestrator | openstack_networking_floatingip_associate_v2.manager_floating_ip_association: Creating... 2026-04-20 00:02:36.240113 | orchestrator | local_file.MANAGER_ADDRESS: Creating... 2026-04-20 00:02:36.242299 | orchestrator | local_file.inventory: Creating... 2026-04-20 00:02:36.247323 | orchestrator | local_file.MANAGER_ADDRESS: Creation complete after 0s [id=38ef3445ad825a73783e3b14326d0ca69a1d6434] 2026-04-20 00:02:36.247994 | orchestrator | local_file.inventory: Creation complete after 0s [id=bfe8af08ce652051aa62211a08e409b7e709d9f9] 2026-04-20 00:02:37.191690 | orchestrator | openstack_networking_floatingip_associate_v2.manager_floating_ip_association: Creation complete after 1s [id=13008940-7e25-4562-919c-dd11ea7d0cc7] 2026-04-20 00:02:44.443484 | orchestrator | openstack_compute_instance_v2.node_server[3]: Still creating... [10s elapsed] 2026-04-20 00:02:44.443610 | orchestrator | openstack_compute_instance_v2.node_server[5]: Still creating... [10s elapsed] 2026-04-20 00:02:44.447106 | orchestrator | openstack_compute_instance_v2.node_server[1]: Still creating... [10s elapsed] 2026-04-20 00:02:44.462559 | orchestrator | openstack_compute_instance_v2.node_server[4]: Still creating... 
[10s elapsed] 2026-04-20 00:02:44.462646 | orchestrator | openstack_compute_instance_v2.node_server[2]: Still creating... [10s elapsed] 2026-04-20 00:02:44.462659 | orchestrator | openstack_compute_instance_v2.node_server[0]: Still creating... [10s elapsed] 2026-04-20 00:02:54.453252 | orchestrator | openstack_compute_instance_v2.node_server[1]: Still creating... [20s elapsed] 2026-04-20 00:02:54.453380 | orchestrator | openstack_compute_instance_v2.node_server[3]: Still creating... [20s elapsed] 2026-04-20 00:02:54.453396 | orchestrator | openstack_compute_instance_v2.node_server[5]: Still creating... [20s elapsed] 2026-04-20 00:02:54.462851 | orchestrator | openstack_compute_instance_v2.node_server[2]: Still creating... [20s elapsed] 2026-04-20 00:02:54.462927 | orchestrator | openstack_compute_instance_v2.node_server[4]: Still creating... [20s elapsed] 2026-04-20 00:02:54.462946 | orchestrator | openstack_compute_instance_v2.node_server[0]: Still creating... [20s elapsed] 2026-04-20 00:03:04.462152 | orchestrator | openstack_compute_instance_v2.node_server[5]: Still creating... [30s elapsed] 2026-04-20 00:03:04.462287 | orchestrator | openstack_compute_instance_v2.node_server[3]: Still creating... [30s elapsed] 2026-04-20 00:03:04.462299 | orchestrator | openstack_compute_instance_v2.node_server[1]: Still creating... [30s elapsed] 2026-04-20 00:03:04.463563 | orchestrator | openstack_compute_instance_v2.node_server[0]: Still creating... [30s elapsed] 2026-04-20 00:03:04.463599 | orchestrator | openstack_compute_instance_v2.node_server[4]: Still creating... [30s elapsed] 2026-04-20 00:03:04.463607 | orchestrator | openstack_compute_instance_v2.node_server[2]: Still creating... [30s elapsed] 2026-04-20 00:03:14.471144 | orchestrator | openstack_compute_instance_v2.node_server[3]: Still creating... [40s elapsed] 2026-04-20 00:03:14.471298 | orchestrator | openstack_compute_instance_v2.node_server[2]: Still creating... 
[40s elapsed] 2026-04-20 00:03:14.471309 | orchestrator | openstack_compute_instance_v2.node_server[1]: Still creating... [40s elapsed] 2026-04-20 00:03:14.471317 | orchestrator | openstack_compute_instance_v2.node_server[0]: Still creating... [40s elapsed] 2026-04-20 00:03:14.471324 | orchestrator | openstack_compute_instance_v2.node_server[5]: Still creating... [40s elapsed] 2026-04-20 00:03:14.471331 | orchestrator | openstack_compute_instance_v2.node_server[4]: Still creating... [40s elapsed] 2026-04-20 00:03:24.480520 | orchestrator | openstack_compute_instance_v2.node_server[5]: Still creating... [50s elapsed] 2026-04-20 00:03:24.480625 | orchestrator | openstack_compute_instance_v2.node_server[3]: Still creating... [50s elapsed] 2026-04-20 00:03:24.480633 | orchestrator | openstack_compute_instance_v2.node_server[1]: Still creating... [50s elapsed] 2026-04-20 00:03:24.480646 | orchestrator | openstack_compute_instance_v2.node_server[4]: Still creating... [50s elapsed] 2026-04-20 00:03:24.480651 | orchestrator | openstack_compute_instance_v2.node_server[2]: Still creating... [50s elapsed] 2026-04-20 00:03:24.480655 | orchestrator | openstack_compute_instance_v2.node_server[0]: Still creating... [50s elapsed] 2026-04-20 00:03:34.489219 | orchestrator | openstack_compute_instance_v2.node_server[1]: Still creating... [1m0s elapsed] 2026-04-20 00:03:34.489306 | orchestrator | openstack_compute_instance_v2.node_server[5]: Still creating... [1m0s elapsed] 2026-04-20 00:03:34.489311 | orchestrator | openstack_compute_instance_v2.node_server[0]: Still creating... [1m0s elapsed] 2026-04-20 00:03:34.489316 | orchestrator | openstack_compute_instance_v2.node_server[4]: Still creating... [1m0s elapsed] 2026-04-20 00:03:34.489320 | orchestrator | openstack_compute_instance_v2.node_server[3]: Still creating... [1m0s elapsed] 2026-04-20 00:03:34.489324 | orchestrator | openstack_compute_instance_v2.node_server[2]: Still creating... 
[1m0s elapsed] 2026-04-20 00:03:35.125096 | orchestrator | openstack_compute_instance_v2.node_server[1]: Creation complete after 1m1s [id=bea271a1-6eda-4aa8-b5d7-7463a8a8544e] 2026-04-20 00:03:35.460088 | orchestrator | openstack_compute_instance_v2.node_server[0]: Creation complete after 1m1s [id=4a8a016b-a0ee-4234-8aa7-8aa4be903b02] 2026-04-20 00:03:35.552009 | orchestrator | openstack_compute_instance_v2.node_server[4]: Creation complete after 1m2s [id=2070b4eb-afa6-4614-954e-088b6291d13b] 2026-04-20 00:03:44.494346 | orchestrator | openstack_compute_instance_v2.node_server[3]: Still creating... [1m10s elapsed] 2026-04-20 00:03:44.494462 | orchestrator | openstack_compute_instance_v2.node_server[2]: Still creating... [1m10s elapsed] 2026-04-20 00:03:44.494476 | orchestrator | openstack_compute_instance_v2.node_server[5]: Still creating... [1m10s elapsed] 2026-04-20 00:03:45.429234 | orchestrator | openstack_compute_instance_v2.node_server[2]: Creation complete after 1m11s [id=e90d6efa-e1ff-4b38-a0eb-943aa53634c2] 2026-04-20 00:03:45.456831 | orchestrator | openstack_compute_instance_v2.node_server[5]: Creation complete after 1m11s [id=96d2a48b-b301-4923-b048-a8b86e5e9722] 2026-04-20 00:03:54.494984 | orchestrator | openstack_compute_instance_v2.node_server[3]: Still creating... [1m20s elapsed] 2026-04-20 00:03:56.055643 | orchestrator | openstack_compute_instance_v2.node_server[3]: Creation complete after 1m22s [id=582ca1eb-c653-47c0-841d-5a58c4ee6fa8] 2026-04-20 00:03:56.107593 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[3]: Creating... 2026-04-20 00:03:56.108621 | orchestrator | null_resource.node_semaphore: Creating... 2026-04-20 00:03:56.109374 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[2]: Creating... 2026-04-20 00:03:56.114074 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[7]: Creating... 
2026-04-20 00:03:56.114710 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[8]: Creating... 2026-04-20 00:03:56.119259 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[1]: Creating... 2026-04-20 00:03:56.120289 | orchestrator | null_resource.node_semaphore: Creation complete after 0s [id=1913299819879935877] 2026-04-20 00:03:56.125085 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[4]: Creating... 2026-04-20 00:03:56.125963 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[5]: Creating... 2026-04-20 00:03:56.128276 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[0]: Creating... 2026-04-20 00:03:56.130173 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[6]: Creating... 2026-04-20 00:03:56.150464 | orchestrator | openstack_compute_instance_v2.manager_server: Creating... 2026-04-20 00:03:59.530444 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[2]: Creation complete after 4s [id=96d2a48b-b301-4923-b048-a8b86e5e9722/6895d0f2-ba69-41e1-a4cc-d0f527389fe4] 2026-04-20 00:03:59.548019 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[3]: Creation complete after 4s [id=582ca1eb-c653-47c0-841d-5a58c4ee6fa8/4d9b431e-9b52-486b-bddb-3e9e0ee5fa39] 2026-04-20 00:03:59.561664 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[1]: Creation complete after 4s [id=2070b4eb-afa6-4614-954e-088b6291d13b/0604a395-fc8c-4060-a9f6-9fb568501435] 2026-04-20 00:03:59.608476 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[8]: Creation complete after 4s [id=96d2a48b-b301-4923-b048-a8b86e5e9722/bb585aa1-11e8-43ef-a761-9431875b84d1] 2026-04-20 00:03:59.614753 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[4]: Creation complete after 4s [id=2070b4eb-afa6-4614-954e-088b6291d13b/9b7f1cab-7403-4991-80fd-9e18e6faf85e] 
2026-04-20 00:03:59.635165 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[6]: Creation complete after 4s [id=582ca1eb-c653-47c0-841d-5a58c4ee6fa8/0c844390-ddcc-47db-87c2-e0ad3f299f11] 2026-04-20 00:04:05.725210 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[0]: Creation complete after 10s [id=582ca1eb-c653-47c0-841d-5a58c4ee6fa8/71e5e2fe-8079-44a9-83c9-718c1a37ec11] 2026-04-20 00:04:05.757273 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[5]: Creation complete after 10s [id=96d2a48b-b301-4923-b048-a8b86e5e9722/bdcbd50e-fc40-4173-bc88-351fd741a560] 2026-04-20 00:04:05.917799 | orchestrator | openstack_compute_volume_attach_v2.node_volume_attachment[7]: Creation complete after 10s [id=2070b4eb-afa6-4614-954e-088b6291d13b/6f84c887-ba73-482f-a41f-d5b1a59c2e3c] 2026-04-20 00:04:06.150689 | orchestrator | openstack_compute_instance_v2.manager_server: Still creating... [10s elapsed] 2026-04-20 00:04:16.151513 | orchestrator | openstack_compute_instance_v2.manager_server: Still creating... [20s elapsed] 2026-04-20 00:04:16.567624 | orchestrator | openstack_compute_instance_v2.manager_server: Creation complete after 21s [id=00b41973-cccc-4bda-b096-18c2d6e2e3e6] 2026-04-20 00:04:16.598424 | orchestrator | 2026-04-20 00:04:16.598491 | orchestrator | Apply complete! Resources: 64 added, 0 changed, 0 destroyed. 
2026-04-20 00:04:16.598536 | orchestrator | 2026-04-20 00:04:16.598549 | orchestrator | Outputs: 2026-04-20 00:04:16.598557 | orchestrator | 2026-04-20 00:04:16.598583 | orchestrator | manager_address = 2026-04-20 00:04:16.598592 | orchestrator | private_key = 2026-04-20 00:04:17.081283 | orchestrator | ok: Runtime: 0:02:03.093208 2026-04-20 00:04:17.116673 | 2026-04-20 00:04:17.116813 | TASK [Fetch manager address] 2026-04-20 00:04:17.620652 | orchestrator | ok 2026-04-20 00:04:17.628852 | 2026-04-20 00:04:17.628987 | TASK [Set manager_host address] 2026-04-20 00:04:17.711500 | orchestrator | ok 2026-04-20 00:04:17.719343 | 2026-04-20 00:04:17.719468 | LOOP [Update ansible collections] 2026-04-20 00:04:18.806270 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2026-04-20 00:04:18.807240 | orchestrator | [WARNING]: Collection osism.services does not support Ansible version 2.15.2 2026-04-20 00:04:18.807423 | orchestrator | Starting galaxy collection install process 2026-04-20 00:04:18.807491 | orchestrator | Process install dependency map 2026-04-20 00:04:18.807531 | orchestrator | Starting collection install process 2026-04-20 00:04:18.807565 | orchestrator | Installing 'osism.commons:999.0.0' to '/home/zuul-testbed03/.ansible/collections/ansible_collections/osism/commons' 2026-04-20 00:04:18.807607 | orchestrator | Created collection for osism.commons:999.0.0 at /home/zuul-testbed03/.ansible/collections/ansible_collections/osism/commons 2026-04-20 00:04:18.807648 | orchestrator | osism.commons:999.0.0 was installed successfully 2026-04-20 00:04:18.807729 | orchestrator | ok: Item: commons Runtime: 0:00:00.649182 2026-04-20 00:04:19.955024 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2026-04-20 00:04:19.955198 | orchestrator | [WARNING]: Collection osism.services does not support Ansible version 2.15.2 2026-04-20 00:04:19.955244 | orchestrator | Starting galaxy collection 
install process 2026-04-20 00:04:19.955280 | orchestrator | Process install dependency map 2026-04-20 00:04:19.955315 | orchestrator | Starting collection install process 2026-04-20 00:04:19.955347 | orchestrator | Installing 'osism.services:999.0.0' to '/home/zuul-testbed03/.ansible/collections/ansible_collections/osism/services' 2026-04-20 00:04:19.955378 | orchestrator | Created collection for osism.services:999.0.0 at /home/zuul-testbed03/.ansible/collections/ansible_collections/osism/services 2026-04-20 00:04:19.955409 | orchestrator | osism.services:999.0.0 was installed successfully 2026-04-20 00:04:19.955459 | orchestrator | ok: Item: services Runtime: 0:00:00.843893 2026-04-20 00:04:19.972460 | 2026-04-20 00:04:19.972644 | TASK [Wait up to 300 seconds for port 22 to become open and contain "OpenSSH"] 2026-04-20 00:04:30.603913 | orchestrator | ok 2026-04-20 00:04:30.617705 | 2026-04-20 00:04:30.617864 | TASK [Wait a little longer for the manager so that everything is ready] 2026-04-20 00:05:30.661228 | orchestrator | ok 2026-04-20 00:05:30.671737 | 2026-04-20 00:05:30.671977 | TASK [Fetch manager ssh hostkey] 2026-04-20 00:05:32.259712 | orchestrator | Output suppressed because no_log was given 2026-04-20 00:05:32.273290 | 2026-04-20 00:05:32.273505 | TASK [Get ssh keypair from terraform environment] 2026-04-20 00:05:32.814650 | orchestrator | ok: Runtime: 0:00:00.014326 2026-04-20 00:05:32.827534 | 2026-04-20 00:05:32.827692 | TASK [Point out that the following task takes some time and does not give any output] 2026-04-20 00:05:32.876241 | orchestrator | ok: The task 'Run manager part 0' runs an Ansible playbook on the manager. There is no further output of this here. It takes a few minutes for this task to complete. 
2026-04-20 00:05:32.886264 |
2026-04-20 00:05:32.886399 | TASK [Run manager part 0]
2026-04-20 00:05:34.041779 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2
2026-04-20 00:05:34.103445 | orchestrator |
2026-04-20 00:05:34.103486 | orchestrator | PLAY [Wait for cloud-init to finish] *******************************************
2026-04-20 00:05:34.103493 | orchestrator |
2026-04-20 00:05:34.103505 | orchestrator | TASK [Check /var/lib/cloud/instance/boot-finished] *****************************
2026-04-20 00:05:38.135014 | orchestrator | ok: [testbed-manager]
2026-04-20 00:05:38.135064 | orchestrator |
2026-04-20 00:05:38.135084 | orchestrator | PLAY [Run manager part 0] ******************************************************
2026-04-20 00:05:38.135094 | orchestrator |
2026-04-20 00:05:38.135120 | orchestrator | TASK [Gathering Facts] *********************************************************
2026-04-20 00:05:40.082288 | orchestrator | ok: [testbed-manager]
2026-04-20 00:05:40.082362 | orchestrator |
2026-04-20 00:05:40.082369 | orchestrator | TASK [Get home directory of ansible user] **************************************
2026-04-20 00:05:40.721204 | orchestrator | ok: [testbed-manager]
2026-04-20 00:05:40.721256 | orchestrator |
2026-04-20 00:05:40.721264 | orchestrator | TASK [Set repo_path fact] ******************************************************
2026-04-20 00:05:40.767925 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:05:40.767974 | orchestrator |
2026-04-20 00:05:40.767983 | orchestrator | TASK [Fail if Ubuntu version is lower than 24.04] ******************************
2026-04-20 00:05:40.803934 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:05:40.803992 | orchestrator |
2026-04-20 00:05:40.804004 | orchestrator | TASK [Fail if Debian version is lower than 12] *********************************
2026-04-20 00:05:40.841609 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:05:40.841674 | orchestrator |
2026-04-20 00:05:40.841683 | orchestrator | TASK [Set APT options on manager] **********************************************
2026-04-20 00:05:41.611168 | orchestrator | changed: [testbed-manager]
2026-04-20 00:05:41.611306 | orchestrator |
2026-04-20 00:05:41.611315 | orchestrator | TASK [Update APT cache and run dist-upgrade] ***********************************
2026-04-20 00:08:54.170744 | orchestrator | changed: [testbed-manager]
2026-04-20 00:08:54.170806 | orchestrator |
2026-04-20 00:08:54.170819 | orchestrator | TASK [Install HWE kernel package on Ubuntu] ************************************
2026-04-20 00:10:16.285916 | orchestrator | changed: [testbed-manager]
2026-04-20 00:10:16.286066 | orchestrator |
2026-04-20 00:10:16.286093 | orchestrator | TASK [Install required packages] ***********************************************
2026-04-20 00:10:38.923640 | orchestrator | changed: [testbed-manager]
2026-04-20 00:10:38.923745 | orchestrator |
2026-04-20 00:10:38.923764 | orchestrator | TASK [Remove some python packages] *********************************************
2026-04-20 00:10:47.666205 | orchestrator | changed: [testbed-manager]
2026-04-20 00:10:47.666302 | orchestrator |
2026-04-20 00:10:47.666320 | orchestrator | TASK [Set venv_command fact (Debian)] ******************************************
2026-04-20 00:10:47.711448 | orchestrator | ok: [testbed-manager]
2026-04-20 00:10:47.711497 | orchestrator |
2026-04-20 00:10:47.711510 | orchestrator | TASK [Get current user] ********************************************************
2026-04-20 00:10:48.544485 | orchestrator | ok: [testbed-manager]
2026-04-20 00:10:48.544594 | orchestrator |
2026-04-20 00:10:48.544605 | orchestrator | TASK [Create venv directory] ***************************************************
2026-04-20 00:10:49.304716 | orchestrator | changed: [testbed-manager]
2026-04-20 00:10:49.304962 | orchestrator |
2026-04-20 00:10:49.304981 | orchestrator | TASK [Install netaddr in venv] *************************************************
2026-04-20 00:10:55.657459 | orchestrator | changed: [testbed-manager]
2026-04-20 00:10:55.657571 | orchestrator |
2026-04-20 00:10:55.657588 | orchestrator | TASK [Install ansible-core in venv] ********************************************
2026-04-20 00:11:01.121237 | orchestrator | changed: [testbed-manager]
2026-04-20 00:11:01.121325 | orchestrator |
2026-04-20 00:11:01.121342 | orchestrator | TASK [Install requests >= 2.32.2] **********************************************
2026-04-20 00:11:03.723824 | orchestrator | changed: [testbed-manager]
2026-04-20 00:11:03.723900 | orchestrator |
2026-04-20 00:11:03.723913 | orchestrator | TASK [Install docker >= 7.1.0] *************************************************
2026-04-20 00:11:05.429441 | orchestrator | changed: [testbed-manager]
2026-04-20 00:11:05.429552 | orchestrator |
2026-04-20 00:11:05.429570 | orchestrator | TASK [Create directories in /opt/src] ******************************************
2026-04-20 00:11:06.534338 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-commons)
2026-04-20 00:11:06.534456 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-services)
2026-04-20 00:11:06.534472 | orchestrator |
2026-04-20 00:11:06.534490 | orchestrator | TASK [Sync sources in /opt/src] ************************************************
2026-04-20 00:11:06.578335 | orchestrator | [DEPRECATION WARNING]: The connection's stdin object is deprecated. Call
2026-04-20 00:11:06.578384 | orchestrator | display.prompt_until(msg) instead. This feature will be removed in version
2026-04-20 00:11:06.578389 | orchestrator | 2.19. Deprecation warnings can be disabled by setting
2026-04-20 00:11:06.578396 | orchestrator | deprecation_warnings=False in ansible.cfg.
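Part 0 above creates `/opt/venv` and then installs netaddr, ansible-core, requests, and docker into it by calling the venv's own pip. A minimal sketch of that bootstrap, with the path changed to a temp directory so it is safe to run anywhere (the real target is `/opt/venv`):

```shell
# Create a venv and address its interpreter by absolute path, as the
# part-0 playbook does. --without-pip keeps this sketch offline-safe;
# the job's venv does include pip.
venv="$(mktemp -d)/venv"
python3 -m venv --without-pip "$venv"
"$venv/bin/python" --version

# With pip available, part 0 then installs into the venv roughly like:
#   "$venv/bin/pip" install netaddr ansible-core 'requests>=2.32.2' 'docker>=7.1.0'
```

Calling `"$venv/bin/pip"` directly avoids having to activate the venv in the playbook's shell.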
2026-04-20 00:11:14.467517 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-commons)
2026-04-20 00:11:14.467594 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-services)
2026-04-20 00:11:14.467607 | orchestrator |
2026-04-20 00:11:14.467618 | orchestrator | TASK [Create /usr/share/ansible directory] *************************************
2026-04-20 00:11:15.054374 | orchestrator | changed: [testbed-manager]
2026-04-20 00:11:15.054415 | orchestrator |
2026-04-20 00:11:15.054422 | orchestrator | TASK [Install collections from Ansible galaxy] *********************************
2026-04-20 00:14:41.414860 | orchestrator | changed: [testbed-manager] => (item=ansible.netcommon)
2026-04-20 00:14:41.414969 | orchestrator | changed: [testbed-manager] => (item=ansible.posix)
2026-04-20 00:14:41.414985 | orchestrator | changed: [testbed-manager] => (item=community.docker>=3.10.2)
2026-04-20 00:14:41.414997 | orchestrator |
2026-04-20 00:14:41.415010 | orchestrator | TASK [Install local collections] ***********************************************
2026-04-20 00:14:43.670901 | orchestrator | changed: [testbed-manager] => (item=ansible-collection-commons)
2026-04-20 00:14:43.671000 | orchestrator | changed: [testbed-manager] => (item=ansible-collection-services)
2026-04-20 00:14:43.671015 | orchestrator |
2026-04-20 00:14:43.671030 | orchestrator | PLAY [Create operator user] ****************************************************
2026-04-20 00:14:43.671042 | orchestrator |
2026-04-20 00:14:43.671054 | orchestrator | TASK [Gathering Facts] *********************************************************
2026-04-20 00:14:45.078334 | orchestrator | ok: [testbed-manager]
2026-04-20 00:14:45.078431 | orchestrator |
2026-04-20 00:14:45.078447 | orchestrator | TASK [osism.commons.operator : Gather variables for each operating system] *****
2026-04-20 00:14:45.124383 | orchestrator | ok: [testbed-manager]
2026-04-20 00:14:45.124481 | orchestrator |
2026-04-20 00:14:45.124498 | orchestrator | TASK [osism.commons.operator : Set operator_groups variable to default value] ***
2026-04-20 00:14:45.190948 | orchestrator | ok: [testbed-manager]
2026-04-20 00:14:45.191063 | orchestrator |
2026-04-20 00:14:45.191092 | orchestrator | TASK [osism.commons.operator : Create operator group] **************************
2026-04-20 00:14:45.947860 | orchestrator | changed: [testbed-manager]
2026-04-20 00:14:45.947962 | orchestrator |
2026-04-20 00:14:45.947984 | orchestrator | TASK [osism.commons.operator : Create user] ************************************
2026-04-20 00:14:46.653760 | orchestrator | changed: [testbed-manager]
2026-04-20 00:14:46.653863 | orchestrator |
2026-04-20 00:14:46.653889 | orchestrator | TASK [osism.commons.operator : Add user to additional groups] ******************
2026-04-20 00:14:47.966541 | orchestrator | changed: [testbed-manager] => (item=adm)
2026-04-20 00:14:47.966625 | orchestrator | changed: [testbed-manager] => (item=sudo)
2026-04-20 00:14:47.966646 | orchestrator |
2026-04-20 00:14:47.966666 | orchestrator | TASK [osism.commons.operator : Copy user sudoers file] *************************
2026-04-20 00:14:49.303391 | orchestrator | changed: [testbed-manager]
2026-04-20 00:14:49.303482 | orchestrator |
2026-04-20 00:14:49.303499 | orchestrator | TASK [osism.commons.operator : Set language variables in .bashrc configuration file] ***
2026-04-20 00:14:51.045465 | orchestrator | changed: [testbed-manager] => (item=export LANGUAGE=C.UTF-8)
2026-04-20 00:14:51.045567 | orchestrator | changed: [testbed-manager] => (item=export LANG=C.UTF-8)
2026-04-20 00:14:51.045599 | orchestrator | changed: [testbed-manager] => (item=export LC_ALL=C.UTF-8)
2026-04-20 00:14:51.045612 | orchestrator |
2026-04-20 00:14:51.045625 | orchestrator | TASK [osism.commons.operator : Set custom environment variables in .bashrc configuration file] ***
2026-04-20 00:14:51.102431 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:14:51.102487 | orchestrator |
2026-04-20 00:14:51.102493 | orchestrator | TASK [osism.commons.operator : Set custom PS1 prompt in .bashrc configuration file] ***
2026-04-20 00:14:51.168298 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:14:51.168351 | orchestrator |
2026-04-20 00:14:51.168357 | orchestrator | TASK [osism.commons.operator : Create .ssh directory] **************************
2026-04-20 00:14:51.721326 | orchestrator | changed: [testbed-manager]
2026-04-20 00:14:51.721422 | orchestrator |
2026-04-20 00:14:51.721440 | orchestrator | TASK [osism.commons.operator : Check number of SSH authorized keys] ************
2026-04-20 00:14:51.790405 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:14:51.790456 | orchestrator |
2026-04-20 00:14:51.790462 | orchestrator | TASK [osism.commons.operator : Set ssh authorized keys] ************************
2026-04-20 00:14:52.621443 | orchestrator | changed: [testbed-manager] => (item=None)
2026-04-20 00:14:52.621526 | orchestrator | changed: [testbed-manager]
2026-04-20 00:14:52.621541 | orchestrator |
2026-04-20 00:14:52.621552 | orchestrator | TASK [osism.commons.operator : Delete ssh authorized keys] *********************
2026-04-20 00:14:52.656530 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:14:52.656599 | orchestrator |
2026-04-20 00:14:52.656609 | orchestrator | TASK [osism.commons.operator : Set authorized GitHub accounts] *****************
2026-04-20 00:14:52.693016 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:14:52.693087 | orchestrator |
2026-04-20 00:14:52.693099 | orchestrator | TASK [osism.commons.operator : Delete authorized GitHub accounts] **************
2026-04-20 00:14:52.737237 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:14:52.737321 | orchestrator |
2026-04-20 00:14:52.737336 | orchestrator | TASK [osism.commons.operator : Set password] ***********************************
2026-04-20 00:14:52.812863 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:14:52.812949 | orchestrator |
2026-04-20 00:14:52.812964 | orchestrator | TASK [osism.commons.operator : Unset & lock password] **************************
2026-04-20 00:14:53.508783 | orchestrator | ok: [testbed-manager]
2026-04-20 00:14:53.508891 | orchestrator |
2026-04-20 00:14:53.508909 | orchestrator | PLAY [Run manager part 0] ******************************************************
2026-04-20 00:14:53.508922 | orchestrator |
2026-04-20 00:14:53.508935 | orchestrator | TASK [Gathering Facts] *********************************************************
2026-04-20 00:14:54.835399 | orchestrator | ok: [testbed-manager]
2026-04-20 00:14:54.835498 | orchestrator |
2026-04-20 00:14:54.835515 | orchestrator | TASK [Recursively change ownership of /opt/venv] *******************************
2026-04-20 00:14:55.784901 | orchestrator | changed: [testbed-manager]
2026-04-20 00:14:55.784949 | orchestrator |
2026-04-20 00:14:55.784955 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:14:55.784961 | orchestrator | testbed-manager : ok=33 changed=23 unreachable=0 failed=0 skipped=10 rescued=0 ignored=0
2026-04-20 00:14:55.784965 | orchestrator |
2026-04-20 00:14:56.270754 | orchestrator | ok: Runtime: 0:09:22.623457
2026-04-20 00:14:56.291073 |
2026-04-20 00:14:56.291242 | TASK [Point out that the log in on the manager is now possible]
2026-04-20 00:14:56.334477 | orchestrator | ok: It is now already possible to log in to the manager with 'make login'.
2026-04-20 00:14:56.344640 |
2026-04-20 00:14:56.344768 | TASK [Point out that the following task takes some time and does not give any output]
2026-04-20 00:14:56.377666 | orchestrator | ok: The task 'Run manager part 1 + 2' runs an Ansible playbook on the manager. There is no further output of this here. It takes a few minutes for this task to complete.
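The operator role above sets the LANGUAGE/LANG/LC_ALL exports in `.bashrc` with Ansible's `lineinfile`, which appends a line only when it is not already present. The same idempotent pattern can be sketched in plain shell; a temp file stands in for the operator's real `.bashrc`:

```shell
# Idempotent append, like lineinfile: add each export only if the exact
# line is missing. Running this loop twice leaves the file unchanged.
bashrc="$(mktemp)"
for line in 'export LANGUAGE=C.UTF-8' 'export LANG=C.UTF-8' 'export LC_ALL=C.UTF-8'; do
    grep -qxF "$line" "$bashrc" || printf '%s\n' "$line" >> "$bashrc"
done
```

`grep -qxF` matches the whole line literally, so a second run finds each line already present and appends nothing.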
2026-04-20 00:14:56.385692 |
2026-04-20 00:14:56.385806 | TASK [Run manager part 1 + 2]
2026-04-20 00:14:57.659944 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2
2026-04-20 00:14:57.738393 | orchestrator |
2026-04-20 00:14:57.738489 | orchestrator | PLAY [Run manager part 1] ******************************************************
2026-04-20 00:14:57.738497 | orchestrator |
2026-04-20 00:14:57.738511 | orchestrator | TASK [Gathering Facts] *********************************************************
2026-04-20 00:15:00.612506 | orchestrator | ok: [testbed-manager]
2026-04-20 00:15:00.612601 | orchestrator |
2026-04-20 00:15:00.612657 | orchestrator | TASK [Set venv_command fact (RedHat)] ******************************************
2026-04-20 00:15:00.648953 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:15:00.649025 | orchestrator |
2026-04-20 00:15:00.649043 | orchestrator | TASK [Set venv_command fact (Debian)] ******************************************
2026-04-20 00:15:00.695583 | orchestrator | ok: [testbed-manager]
2026-04-20 00:15:00.695646 | orchestrator |
2026-04-20 00:15:00.695660 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] ***
2026-04-20 00:15:00.733463 | orchestrator | ok: [testbed-manager]
2026-04-20 00:15:00.733557 | orchestrator |
2026-04-20 00:15:00.733575 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] ***
2026-04-20 00:15:00.809275 | orchestrator | ok: [testbed-manager]
2026-04-20 00:15:00.809336 | orchestrator |
2026-04-20 00:15:00.809344 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ******************
2026-04-20 00:15:00.867970 | orchestrator | ok: [testbed-manager]
2026-04-20 00:15:00.868026 | orchestrator |
2026-04-20 00:15:00.868033 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] ***
2026-04-20 00:15:00.913404 | orchestrator | included: /home/zuul-testbed03/.ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-manager
2026-04-20 00:15:00.913462 | orchestrator |
2026-04-20 00:15:00.913469 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] *****
2026-04-20 00:15:01.624710 | orchestrator | ok: [testbed-manager]
2026-04-20 00:15:01.624808 | orchestrator |
2026-04-20 00:15:01.624901 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] *************
2026-04-20 00:15:01.674277 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:15:01.674340 | orchestrator |
2026-04-20 00:15:01.674349 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] ***************
2026-04-20 00:15:03.043206 | orchestrator | changed: [testbed-manager]
2026-04-20 00:15:03.043289 | orchestrator |
2026-04-20 00:15:03.043310 | orchestrator | TASK [osism.commons.repository : Remove sources.list file] *********************
2026-04-20 00:15:03.619769 | orchestrator | ok: [testbed-manager]
2026-04-20 00:15:03.619863 | orchestrator |
2026-04-20 00:15:03.619880 | orchestrator | TASK [osism.commons.repository : Copy ubuntu.sources file] *********************
2026-04-20 00:15:04.746314 | orchestrator | changed: [testbed-manager]
2026-04-20 00:15:04.746435 | orchestrator |
2026-04-20 00:15:04.746502 | orchestrator | TASK [osism.commons.repository : Update package cache] *************************
2026-04-20 00:15:20.018638 | orchestrator | changed: [testbed-manager]
2026-04-20 00:15:20.018773 | orchestrator |
2026-04-20 00:15:20.018796 | orchestrator | TASK [Get home directory of ansible user] **************************************
2026-04-20 00:15:20.655999 | orchestrator | ok: [testbed-manager]
2026-04-20 00:15:20.656077 | orchestrator |
2026-04-20 00:15:20.656090 | orchestrator | TASK [Set repo_path fact] ******************************************************
2026-04-20 00:15:20.716664 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:15:20.716757 | orchestrator |
2026-04-20 00:15:20.716773 | orchestrator | TASK [Copy SSH public key] *****************************************************
2026-04-20 00:15:21.595999 | orchestrator | changed: [testbed-manager]
2026-04-20 00:15:21.596084 | orchestrator |
2026-04-20 00:15:21.596101 | orchestrator | TASK [Copy SSH private key] ****************************************************
2026-04-20 00:15:22.442791 | orchestrator | changed: [testbed-manager]
2026-04-20 00:15:22.442905 | orchestrator |
2026-04-20 00:15:22.442932 | orchestrator | TASK [Create configuration directory] ******************************************
2026-04-20 00:15:22.961119 | orchestrator | changed: [testbed-manager]
2026-04-20 00:15:22.961247 | orchestrator |
2026-04-20 00:15:22.961265 | orchestrator | TASK [Copy testbed repo] *******************************************************
2026-04-20 00:15:22.997320 | orchestrator | [DEPRECATION WARNING]: The connection's stdin object is deprecated. Call
2026-04-20 00:15:22.997404 | orchestrator | display.prompt_until(msg) instead. This feature will be removed in version
2026-04-20 00:15:22.997414 | orchestrator | 2.19. Deprecation warnings can be disabled by setting
2026-04-20 00:15:22.997422 | orchestrator | deprecation_warnings=False in ansible.cfg.
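The repository role above removes the legacy one-line `sources.list` and installs a deb822-style `ubuntu.sources` (the format Ubuntu 24.04 uses). A sketch of that swap against a temp directory; the suite and component values below are illustrative assumptions, the role templates the real ones:

```shell
# Replace sources.list with a deb822 ubuntu.sources, as the
# osism.commons.repository role does for Ubuntu >= 24.04.
aptroot="$(mktemp -d)"                 # stands in for /etc/apt
mkdir -p "$aptroot/sources.list.d"
rm -f "$aptroot/sources.list"          # "Remove sources.list file"
cat > "$aptroot/sources.list.d/ubuntu.sources" <<'EOF'
Types: deb
URIs: http://archive.ubuntu.com/ubuntu/
Suites: noble noble-updates noble-security
Components: main restricted universe multiverse
EOF
```

After such a change an `apt-get update` is required, which is what the "Update package cache" task does next.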
2026-04-20 00:15:25.228879 | orchestrator | changed: [testbed-manager]
2026-04-20 00:15:25.228944 | orchestrator |
2026-04-20 00:15:25.228952 | orchestrator | TASK [Install python requirements in venv] *************************************
2026-04-20 00:15:33.870591 | orchestrator | ok: [testbed-manager] => (item=Jinja2)
2026-04-20 00:15:33.870679 | orchestrator | ok: [testbed-manager] => (item=PyYAML)
2026-04-20 00:15:33.870693 | orchestrator | ok: [testbed-manager] => (item=packaging)
2026-04-20 00:15:33.870703 | orchestrator | changed: [testbed-manager] => (item=python-gilt==1.2.3)
2026-04-20 00:15:33.870720 | orchestrator | ok: [testbed-manager] => (item=requests>=2.32.2)
2026-04-20 00:15:33.870730 | orchestrator | ok: [testbed-manager] => (item=docker>=7.1.0)
2026-04-20 00:15:33.870740 | orchestrator |
2026-04-20 00:15:33.870750 | orchestrator | TASK [Copy testbed custom CA certificate on Debian/Ubuntu] *********************
2026-04-20 00:15:34.885213 | orchestrator | changed: [testbed-manager]
2026-04-20 00:15:34.885260 | orchestrator |
2026-04-20 00:15:34.885268 | orchestrator | TASK [Run update-ca-certificates on Debian/Ubuntu] *****************************
2026-04-20 00:15:37.933628 | orchestrator | changed: [testbed-manager]
2026-04-20 00:15:37.933714 | orchestrator |
2026-04-20 00:15:37.933730 | orchestrator | TASK [Run update-ca-trust on RedHat] *******************************************
2026-04-20 00:15:37.983322 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:15:37.983437 | orchestrator |
2026-04-20 00:15:37.983453 | orchestrator | TASK [Run manager part 2] ******************************************************
2026-04-20 00:17:17.320840 | orchestrator | changed: [testbed-manager]
2026-04-20 00:17:17.320959 | orchestrator |
2026-04-20 00:17:17.320978 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] *****
2026-04-20 00:17:18.506767 | orchestrator | ok: [testbed-manager]
2026-04-20 00:17:18.506842 | orchestrator |
2026-04-20 00:17:18.506861 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:17:18.506876 | orchestrator | testbed-manager : ok=21 changed=11 unreachable=0 failed=0 skipped=4 rescued=0 ignored=0
2026-04-20 00:17:18.506888 | orchestrator |
2026-04-20 00:17:19.020958 | orchestrator | ok: Runtime: 0:02:21.913947
2026-04-20 00:17:19.037213 |
2026-04-20 00:17:19.037423 | TASK [Reboot manager]
2026-04-20 00:17:20.572093 | orchestrator | ok: Runtime: 0:00:01.002184
2026-04-20 00:17:20.587935 |
2026-04-20 00:17:20.588094 | TASK [Wait up to 300 seconds for port 22 to become open and contain "OpenSSH"]
2026-04-20 00:17:34.620763 | orchestrator | ok
2026-04-20 00:17:34.630685 |
2026-04-20 00:17:34.630814 | TASK [Wait a little longer for the manager so that everything is ready]
2026-04-20 00:18:34.660417 | orchestrator | ok
2026-04-20 00:18:34.667532 |
2026-04-20 00:18:34.667639 | TASK [Deploy manager + bootstrap nodes]
2026-04-20 00:18:36.935917 | orchestrator |
2026-04-20 00:18:36.936157 | orchestrator | # DEPLOY MANAGER
2026-04-20 00:18:36.936184 | orchestrator |
2026-04-20 00:18:36.936199 | orchestrator | + set -e
2026-04-20 00:18:36.936212 | orchestrator | + echo
2026-04-20 00:18:36.936226 | orchestrator | + echo '# DEPLOY MANAGER'
2026-04-20 00:18:36.936243 | orchestrator | + echo
2026-04-20 00:18:36.936294 | orchestrator | + cat /opt/manager-vars.sh
2026-04-20 00:18:36.938552 | orchestrator | export NUMBER_OF_NODES=6
2026-04-20 00:18:36.938578 | orchestrator |
2026-04-20 00:18:36.938591 | orchestrator | export CEPH_VERSION=reef
2026-04-20 00:18:36.938604 | orchestrator | export CONFIGURATION_VERSION=main
2026-04-20 00:18:36.938616 | orchestrator | export MANAGER_VERSION=10.0.0
2026-04-20 00:18:36.938638 | orchestrator | export OPENSTACK_VERSION=2024.2
2026-04-20 00:18:36.938649 | orchestrator |
2026-04-20 00:18:36.938666 | orchestrator | export ARA=false
2026-04-20 00:18:36.938678 | orchestrator | export DEPLOY_MODE=manager
2026-04-20 00:18:36.938695 | orchestrator | export TEMPEST=true
2026-04-20 00:18:36.938706 | orchestrator | export IS_ZUUL=true
2026-04-20 00:18:36.938717 | orchestrator |
2026-04-20 00:18:36.938735 | orchestrator | export MANAGER_PUBLIC_IP_ADDRESS=81.163.193.117
2026-04-20 00:18:36.938746 | orchestrator | export EXTERNAL_API=false
2026-04-20 00:18:36.938757 | orchestrator |
2026-04-20 00:18:36.938767 | orchestrator | export IMAGE_USER=ubuntu
2026-04-20 00:18:36.938782 | orchestrator | export IMAGE_NODE_USER=ubuntu
2026-04-20 00:18:36.938793 | orchestrator |
2026-04-20 00:18:36.938803 | orchestrator | export CEPH_STACK=ceph-ansible
2026-04-20 00:18:36.938820 | orchestrator |
2026-04-20 00:18:36.938831 | orchestrator | + echo
2026-04-20 00:18:36.938843 | orchestrator | + source /opt/configuration/scripts/include.sh
2026-04-20 00:18:36.939499 | orchestrator | ++ export INTERACTIVE=false
2026-04-20 00:18:36.939516 | orchestrator | ++ INTERACTIVE=false
2026-04-20 00:18:36.939527 | orchestrator | ++ export OSISM_APPLY_RETRY=1
2026-04-20 00:18:36.939539 | orchestrator | ++ OSISM_APPLY_RETRY=1
2026-04-20 00:18:36.939799 | orchestrator | + source /opt/manager-vars.sh
2026-04-20 00:18:36.939814 | orchestrator | ++ export NUMBER_OF_NODES=6
2026-04-20 00:18:36.939826 | orchestrator | ++ NUMBER_OF_NODES=6
2026-04-20 00:18:36.939840 | orchestrator | ++ export CEPH_VERSION=reef
2026-04-20 00:18:36.939852 | orchestrator | ++ CEPH_VERSION=reef
2026-04-20 00:18:36.939863 | orchestrator | ++ export CONFIGURATION_VERSION=main
2026-04-20 00:18:36.939874 | orchestrator | ++ CONFIGURATION_VERSION=main
2026-04-20 00:18:36.939884 | orchestrator | ++ export MANAGER_VERSION=10.0.0
2026-04-20 00:18:36.939895 | orchestrator | ++ MANAGER_VERSION=10.0.0
2026-04-20 00:18:36.939910 | orchestrator | ++ export OPENSTACK_VERSION=2024.2
2026-04-20 00:18:36.939929 | orchestrator | ++ OPENSTACK_VERSION=2024.2
2026-04-20 00:18:36.939940 | orchestrator | ++ export ARA=false
2026-04-20 00:18:36.939951 | orchestrator | ++ ARA=false
2026-04-20 00:18:36.939961 | orchestrator | ++ export DEPLOY_MODE=manager
2026-04-20 00:18:36.939975 | orchestrator | ++ DEPLOY_MODE=manager
2026-04-20 00:18:36.939986 | orchestrator | ++ export TEMPEST=true
2026-04-20 00:18:36.939997 | orchestrator | ++ TEMPEST=true
2026-04-20 00:18:36.940008 | orchestrator | ++ export IS_ZUUL=true
2026-04-20 00:18:36.940021 | orchestrator | ++ IS_ZUUL=true
2026-04-20 00:18:36.940140 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.193.117
2026-04-20 00:18:36.940156 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.193.117
2026-04-20 00:18:36.940167 | orchestrator | ++ export EXTERNAL_API=false
2026-04-20 00:18:36.940178 | orchestrator | ++ EXTERNAL_API=false
2026-04-20 00:18:36.940193 | orchestrator | ++ export IMAGE_USER=ubuntu
2026-04-20 00:18:36.940204 | orchestrator | ++ IMAGE_USER=ubuntu
2026-04-20 00:18:36.940215 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu
2026-04-20 00:18:36.940226 | orchestrator | ++ IMAGE_NODE_USER=ubuntu
2026-04-20 00:18:36.940237 | orchestrator | ++ export CEPH_STACK=ceph-ansible
2026-04-20 00:18:36.940251 | orchestrator | ++ CEPH_STACK=ceph-ansible
2026-04-20 00:18:36.940262 | orchestrator | + sudo ln -sf /opt/configuration/contrib/semver2.sh /usr/local/bin/semver
2026-04-20 00:18:36.990968 | orchestrator | + docker version
2026-04-20 00:18:37.099734 | orchestrator | Client: Docker Engine - Community
2026-04-20 00:18:37.099832 | orchestrator | Version: 27.5.1
2026-04-20 00:18:37.099845 | orchestrator | API version: 1.47
2026-04-20 00:18:37.099859 | orchestrator | Go version: go1.22.11
2026-04-20 00:18:37.099870 | orchestrator | Git commit: 9f9e405
2026-04-20 00:18:37.099881 | orchestrator | Built: Wed Jan 22 13:41:48 2025
2026-04-20 00:18:37.099893 | orchestrator | OS/Arch: linux/amd64
2026-04-20 00:18:37.099904 | orchestrator | Context: default
2026-04-20 00:18:37.099914 | orchestrator |
2026-04-20 00:18:37.099925 | orchestrator | Server: Docker Engine - Community
2026-04-20 00:18:37.099936 | orchestrator | Engine:
2026-04-20 00:18:37.099947 | orchestrator | Version: 27.5.1
2026-04-20 00:18:37.099959 | orchestrator | API version: 1.47 (minimum version 1.24)
2026-04-20 00:18:37.100000 | orchestrator | Go version: go1.22.11
2026-04-20 00:18:37.100011 | orchestrator | Git commit: 4c9b3b0
2026-04-20 00:18:37.100022 | orchestrator | Built: Wed Jan 22 13:41:48 2025
2026-04-20 00:18:37.100033 | orchestrator | OS/Arch: linux/amd64
2026-04-20 00:18:37.100043 | orchestrator | Experimental: false
2026-04-20 00:18:37.100090 | orchestrator | containerd:
2026-04-20 00:18:37.100102 | orchestrator | Version: v2.2.3
2026-04-20 00:18:37.100113 | orchestrator | GitCommit: 77c84241c7cbdd9b4eca2591793e3d4f4317c590
2026-04-20 00:18:37.100124 | orchestrator | runc:
2026-04-20 00:18:37.100136 | orchestrator | Version: 1.3.5
2026-04-20 00:18:37.100147 | orchestrator | GitCommit: v1.3.5-0-g488fc13e
2026-04-20 00:18:37.100158 | orchestrator | docker-init:
2026-04-20 00:18:37.100168 | orchestrator | Version: 0.19.0
2026-04-20 00:18:37.100180 | orchestrator | GitCommit: de40ad0
2026-04-20 00:18:37.102429 | orchestrator | + sh -c /opt/configuration/scripts/deploy/000-manager.sh
2026-04-20 00:18:37.110948 | orchestrator | + set -e
2026-04-20 00:18:37.110977 | orchestrator | + source /opt/manager-vars.sh
2026-04-20 00:18:37.110988 | orchestrator | ++ export NUMBER_OF_NODES=6
2026-04-20 00:18:37.110999 | orchestrator | ++ NUMBER_OF_NODES=6
2026-04-20 00:18:37.111010 | orchestrator | ++ export CEPH_VERSION=reef
2026-04-20 00:18:37.111021 | orchestrator | ++ CEPH_VERSION=reef
2026-04-20 00:18:37.111032 | orchestrator | ++ export CONFIGURATION_VERSION=main
2026-04-20 00:18:37.111043 | orchestrator | ++ CONFIGURATION_VERSION=main
2026-04-20 00:18:37.111078 | orchestrator | ++ export MANAGER_VERSION=10.0.0
2026-04-20 00:18:37.111089 | orchestrator | ++ MANAGER_VERSION=10.0.0
2026-04-20 00:18:37.111101 | orchestrator | ++ export OPENSTACK_VERSION=2024.2
2026-04-20 00:18:37.111111 | orchestrator | ++ OPENSTACK_VERSION=2024.2
2026-04-20 00:18:37.111122 | orchestrator | ++ export ARA=false
2026-04-20 00:18:37.111133 | orchestrator | ++ ARA=false
2026-04-20 00:18:37.111144 | orchestrator | ++ export DEPLOY_MODE=manager
2026-04-20 00:18:37.111155 | orchestrator | ++ DEPLOY_MODE=manager
2026-04-20 00:18:37.111166 | orchestrator | ++ export TEMPEST=true
2026-04-20 00:18:37.111176 | orchestrator | ++ TEMPEST=true
2026-04-20 00:18:37.111187 | orchestrator | ++ export IS_ZUUL=true
2026-04-20 00:18:37.111198 | orchestrator | ++ IS_ZUUL=true
2026-04-20 00:18:37.111209 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.193.117
2026-04-20 00:18:37.111220 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.193.117
2026-04-20 00:18:37.111231 | orchestrator | ++ export EXTERNAL_API=false
2026-04-20 00:18:37.111245 | orchestrator | ++ EXTERNAL_API=false
2026-04-20 00:18:37.111256 | orchestrator | ++ export IMAGE_USER=ubuntu
2026-04-20 00:18:37.111267 | orchestrator | ++ IMAGE_USER=ubuntu
2026-04-20 00:18:37.111278 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu
2026-04-20 00:18:37.111289 | orchestrator | ++ IMAGE_NODE_USER=ubuntu
2026-04-20 00:18:37.111300 | orchestrator | ++ export CEPH_STACK=ceph-ansible
2026-04-20 00:18:37.111310 | orchestrator | ++ CEPH_STACK=ceph-ansible
2026-04-20 00:18:37.111321 | orchestrator | + source /opt/configuration/scripts/include.sh
2026-04-20 00:18:37.111332 | orchestrator | ++ export INTERACTIVE=false
2026-04-20 00:18:37.111343 | orchestrator | ++ INTERACTIVE=false
2026-04-20 00:18:37.111354 | orchestrator | ++ export OSISM_APPLY_RETRY=1
2026-04-20 00:18:37.111369 | orchestrator | ++ OSISM_APPLY_RETRY=1
2026-04-20 00:18:37.111510 | orchestrator | + [[ 10.0.0 != \l\a\t\e\s\t ]]
2026-04-20 00:18:37.111526 | orchestrator | + /opt/configuration/scripts/set-manager-version.sh 10.0.0
2026-04-20 00:18:37.118401 | orchestrator | + set -e
2026-04-20 00:18:37.118457 | orchestrator | + VERSION=10.0.0
2026-04-20 00:18:37.118474 | orchestrator | + sed -i 's/manager_version: .*/manager_version: 10.0.0/g' /opt/configuration/environments/manager/configuration.yml
2026-04-20 00:18:37.123549 | orchestrator | + [[ 10.0.0 != \l\a\t\e\s\t ]]
2026-04-20 00:18:37.123580 | orchestrator | + sed -i /ceph_version:/d /opt/configuration/environments/manager/configuration.yml
2026-04-20 00:18:37.126657 | orchestrator | + sed -i /openstack_version:/d /opt/configuration/environments/manager/configuration.yml
2026-04-20 00:18:37.129551 | orchestrator | + sh -c /opt/configuration/scripts/sync-configuration-repository.sh
2026-04-20 00:18:37.136805 | orchestrator | /opt/configuration ~
2026-04-20 00:18:37.136858 | orchestrator | + set -e
2026-04-20 00:18:37.136874 | orchestrator | + pushd /opt/configuration
2026-04-20 00:18:37.136889 | orchestrator | + [[ -e /opt/venv/bin/activate ]]
2026-04-20 00:18:37.138605 | orchestrator | + source /opt/venv/bin/activate
2026-04-20 00:18:37.139421 | orchestrator | ++ deactivate nondestructive
2026-04-20 00:18:37.139439 | orchestrator | ++ '[' -n '' ']'
2026-04-20 00:18:37.139454 | orchestrator | ++ '[' -n '' ']'
2026-04-20 00:18:37.139489 | orchestrator | ++ hash -r
2026-04-20 00:18:37.139502 | orchestrator | ++ '[' -n '' ']'
2026-04-20 00:18:37.139513 | orchestrator | ++ unset VIRTUAL_ENV
2026-04-20 00:18:37.139607 | orchestrator | ++ unset VIRTUAL_ENV_PROMPT
2026-04-20 00:18:37.139623 | orchestrator | ++ '[' '!' nondestructive = nondestructive ']'
2026-04-20 00:18:37.139635 | orchestrator | ++ '[' linux-gnu = cygwin ']'
2026-04-20 00:18:37.139646 | orchestrator | ++ '[' linux-gnu = msys ']'
2026-04-20 00:18:37.139656 | orchestrator | ++ export VIRTUAL_ENV=/opt/venv
2026-04-20 00:18:37.139667 | orchestrator | ++ VIRTUAL_ENV=/opt/venv
2026-04-20 00:18:37.139678 | orchestrator | ++ _OLD_VIRTUAL_PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
2026-04-20 00:18:37.139694 | orchestrator | ++ PATH=/opt/venv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
2026-04-20 00:18:37.139705 | orchestrator | ++ export PATH
2026-04-20 00:18:37.139717 | orchestrator | ++ '[' -n '' ']'
2026-04-20 00:18:37.139727 | orchestrator | ++ '[' -z '' ']'
2026-04-20 00:18:37.139738 | orchestrator | ++ _OLD_VIRTUAL_PS1=
2026-04-20 00:18:37.139752 | orchestrator | ++ PS1='(venv) '
2026-04-20 00:18:37.139763 | orchestrator | ++ export PS1
2026-04-20 00:18:37.139774 | orchestrator | ++ VIRTUAL_ENV_PROMPT='(venv) '
2026-04-20 00:18:37.139784 | orchestrator | ++ export VIRTUAL_ENV_PROMPT
2026-04-20 00:18:37.139795 | orchestrator | ++ hash -r
2026-04-20 00:18:37.139806 | orchestrator | + pip3 install --no-cache-dir python-gilt==1.2.3 requests Jinja2 PyYAML packaging
2026-04-20 00:18:38.054173 | orchestrator | Requirement already satisfied: python-gilt==1.2.3 in /opt/venv/lib/python3.12/site-packages (1.2.3)
2026-04-20 00:18:38.054550 | orchestrator | Requirement already satisfied: requests in /opt/venv/lib/python3.12/site-packages (2.33.1)
2026-04-20 00:18:38.056038 | orchestrator | Requirement already satisfied: Jinja2 in /opt/venv/lib/python3.12/site-packages (3.1.6)
2026-04-20 00:18:38.057780 | orchestrator | Requirement already satisfied: PyYAML in /opt/venv/lib/python3.12/site-packages (6.0.3)
2026-04-20 00:18:38.058948 | orchestrator | Requirement already satisfied: packaging in /opt/venv/lib/python3.12/site-packages (26.1)
2026-04-20 00:18:38.068391 | orchestrator | Requirement already satisfied: click in /opt/venv/lib/python3.12/site-packages (from python-gilt==1.2.3) (8.3.2)
2026-04-20 00:18:38.069662 | orchestrator | Requirement already satisfied: colorama in /opt/venv/lib/python3.12/site-packages (from python-gilt==1.2.3) (0.4.6)
2026-04-20 00:18:38.070690 | orchestrator | Requirement already satisfied: fasteners in /opt/venv/lib/python3.12/site-packages (from python-gilt==1.2.3) (0.20)
2026-04-20 00:18:38.071989 | orchestrator | Requirement already satisfied: sh in /opt/venv/lib/python3.12/site-packages (from python-gilt==1.2.3) (2.2.2)
2026-04-20 00:18:38.094675 | orchestrator | Requirement already satisfied: charset_normalizer<4,>=2 in /opt/venv/lib/python3.12/site-packages (from requests) (3.4.7)
2026-04-20 00:18:38.096008 | orchestrator | Requirement already satisfied: idna<4,>=2.5 in /opt/venv/lib/python3.12/site-packages (from requests) (3.11)
2026-04-20 00:18:38.097426 | orchestrator | Requirement already satisfied: urllib3<3,>=1.26 in /opt/venv/lib/python3.12/site-packages (from requests) (2.6.3)
2026-04-20 00:18:38.098828 | orchestrator | Requirement already satisfied: certifi>=2023.5.7 in /opt/venv/lib/python3.12/site-packages (from requests) (2026.2.25)
2026-04-20 00:18:38.102399 | orchestrator | Requirement already satisfied: MarkupSafe>=2.0 in /opt/venv/lib/python3.12/site-packages (from Jinja2) (3.0.3)
2026-04-20 00:18:38.266590 | orchestrator | ++ which gilt
2026-04-20 00:18:38.268676 | orchestrator | + GILT=/opt/venv/bin/gilt
2026-04-20 00:18:38.268729 | orchestrator | + /opt/venv/bin/gilt overlay
2026-04-20 00:18:38.469331 | orchestrator | osism.cfg-generics:
2026-04-20 00:18:38.597549 | orchestrator | - copied (v0.20260319.0) /home/dragon/.gilt/clone/github.com/osism.cfg-generics/environments/manager/images.yml to /opt/configuration/environments/manager/
2026-04-20 00:18:38.597659 | orchestrator | - copied
(v0.20260319.0) /home/dragon/.gilt/clone/github.com/osism.cfg-generics/src/render-images.py to /opt/configuration/environments/manager/ 2026-04-20 00:18:38.597688 | orchestrator | - copied (v0.20260319.0) /home/dragon/.gilt/clone/github.com/osism.cfg-generics/src/set-versions.py to /opt/configuration/environments/ 2026-04-20 00:18:38.597703 | orchestrator | - running `/opt/configuration/scripts/wrapper-gilt.sh render-images` in /opt/configuration/environments/manager/ 2026-04-20 00:18:39.362187 | orchestrator | - running `rm render-images.py` in /opt/configuration/environments/manager/ 2026-04-20 00:18:39.373094 | orchestrator | - running `/opt/configuration/scripts/wrapper-gilt.sh set-versions` in /opt/configuration/environments/ 2026-04-20 00:18:39.690192 | orchestrator | - running `rm set-versions.py` in /opt/configuration/environments/ 2026-04-20 00:18:39.728492 | orchestrator | + [[ -e /opt/venv/bin/activate ]] 2026-04-20 00:18:39.728582 | orchestrator | + deactivate 2026-04-20 00:18:39.728597 | orchestrator | + '[' -n /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin ']' 2026-04-20 00:18:39.728611 | orchestrator | + PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2026-04-20 00:18:39.728622 | orchestrator | + export PATH 2026-04-20 00:18:39.728633 | orchestrator | + unset _OLD_VIRTUAL_PATH 2026-04-20 00:18:39.728645 | orchestrator | + '[' -n '' ']' 2026-04-20 00:18:39.728659 | orchestrator | + hash -r 2026-04-20 00:18:39.728669 | orchestrator | + '[' -n '' ']' 2026-04-20 00:18:39.728680 | orchestrator | + unset VIRTUAL_ENV 2026-04-20 00:18:39.728691 | orchestrator | + unset VIRTUAL_ENV_PROMPT 2026-04-20 00:18:39.728701 | orchestrator | + '[' '!' 
'' = nondestructive ']' 2026-04-20 00:18:39.728712 | orchestrator | + unset -f deactivate 2026-04-20 00:18:39.728723 | orchestrator | ~ 2026-04-20 00:18:39.728734 | orchestrator | + popd 2026-04-20 00:18:39.729789 | orchestrator | + [[ 10.0.0 == \l\a\t\e\s\t ]] 2026-04-20 00:18:39.729812 | orchestrator | + [[ ceph-ansible == \r\o\o\k ]] 2026-04-20 00:18:39.730631 | orchestrator | ++ semver 10.0.0 7.0.0 2026-04-20 00:18:39.775551 | orchestrator | + [[ 1 -ge 0 ]] 2026-04-20 00:18:39.775659 | orchestrator | + echo 'enable_osism_kubernetes: true' 2026-04-20 00:18:39.775673 | orchestrator | + [[ 10.0.0 == \l\a\t\e\s\t ]] 2026-04-20 00:18:39.775704 | orchestrator | ++ semver 10.0.0 10.0.0-0 2026-04-20 00:18:39.820502 | orchestrator | + [[ 1 -ge 0 ]] 2026-04-20 00:18:39.820599 | orchestrator | + sed -i '/^om_enable_rabbitmq_high_availability:/d' /opt/configuration/environments/kolla/configuration.yml 2026-04-20 00:18:39.824371 | orchestrator | + sed -i '/^om_enable_rabbitmq_quorum_queues:/d' /opt/configuration/environments/kolla/configuration.yml 2026-04-20 00:18:39.829126 | orchestrator | + /opt/configuration/scripts/enable-resource-nodes.sh 2026-04-20 00:18:39.917197 | orchestrator | + [[ -e /opt/venv/bin/activate ]] 2026-04-20 00:18:39.917304 | orchestrator | + source /opt/venv/bin/activate 2026-04-20 00:18:39.917320 | orchestrator | ++ deactivate nondestructive 2026-04-20 00:18:39.917331 | orchestrator | ++ '[' -n '' ']' 2026-04-20 00:18:39.917342 | orchestrator | ++ '[' -n '' ']' 2026-04-20 00:18:39.917353 | orchestrator | ++ hash -r 2026-04-20 00:18:39.917364 | orchestrator | ++ '[' -n '' ']' 2026-04-20 00:18:39.917374 | orchestrator | ++ unset VIRTUAL_ENV 2026-04-20 00:18:39.917385 | orchestrator | ++ unset VIRTUAL_ENV_PROMPT 2026-04-20 00:18:39.917396 | orchestrator | ++ '[' '!' 
nondestructive = nondestructive ']' 2026-04-20 00:18:39.917420 | orchestrator | ++ '[' linux-gnu = cygwin ']' 2026-04-20 00:18:39.917432 | orchestrator | ++ '[' linux-gnu = msys ']' 2026-04-20 00:18:39.917470 | orchestrator | ++ export VIRTUAL_ENV=/opt/venv 2026-04-20 00:18:39.917481 | orchestrator | ++ VIRTUAL_ENV=/opt/venv 2026-04-20 00:18:39.917492 | orchestrator | ++ _OLD_VIRTUAL_PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2026-04-20 00:18:39.917504 | orchestrator | ++ PATH=/opt/venv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2026-04-20 00:18:39.917515 | orchestrator | ++ export PATH 2026-04-20 00:18:39.917685 | orchestrator | ++ '[' -n '' ']' 2026-04-20 00:18:39.917701 | orchestrator | ++ '[' -z '' ']' 2026-04-20 00:18:39.917712 | orchestrator | ++ _OLD_VIRTUAL_PS1= 2026-04-20 00:18:39.917722 | orchestrator | ++ PS1='(venv) ' 2026-04-20 00:18:39.917733 | orchestrator | ++ export PS1 2026-04-20 00:18:39.917744 | orchestrator | ++ VIRTUAL_ENV_PROMPT='(venv) ' 2026-04-20 00:18:39.917754 | orchestrator | ++ export VIRTUAL_ENV_PROMPT 2026-04-20 00:18:39.917765 | orchestrator | ++ hash -r 2026-04-20 00:18:39.918154 | orchestrator | + ansible-playbook -i testbed-manager, --vault-password-file /opt/configuration/environments/.vault_pass /opt/configuration/ansible/manager-part-3.yml 2026-04-20 00:18:40.913437 | orchestrator | 2026-04-20 00:18:40.913564 | orchestrator | PLAY [Copy custom facts] ******************************************************* 2026-04-20 00:18:40.913590 | orchestrator | 2026-04-20 00:18:40.913612 | orchestrator | TASK [Create custom facts directory] ******************************************* 2026-04-20 00:18:41.448615 | orchestrator | ok: [testbed-manager] 2026-04-20 00:18:41.448703 | orchestrator | 2026-04-20 00:18:41.448739 | orchestrator | TASK [Copy fact files] ********************************************************* 
2026-04-20 00:18:42.389508 | orchestrator | changed: [testbed-manager] 2026-04-20 00:18:42.389590 | orchestrator | 2026-04-20 00:18:42.389599 | orchestrator | PLAY [Before the deployment of the manager] ************************************ 2026-04-20 00:18:42.389607 | orchestrator | 2026-04-20 00:18:42.389617 | orchestrator | TASK [Gathering Facts] ********************************************************* 2026-04-20 00:18:44.613300 | orchestrator | ok: [testbed-manager] 2026-04-20 00:18:44.613408 | orchestrator | 2026-04-20 00:18:44.613423 | orchestrator | TASK [Get /opt/manager-vars.sh] ************************************************ 2026-04-20 00:18:44.667028 | orchestrator | ok: [testbed-manager] 2026-04-20 00:18:44.667157 | orchestrator | 2026-04-20 00:18:44.667173 | orchestrator | TASK [Add ara_server_mariadb_volume_type parameter] **************************** 2026-04-20 00:18:45.109645 | orchestrator | changed: [testbed-manager] 2026-04-20 00:18:45.109744 | orchestrator | 2026-04-20 00:18:45.109758 | orchestrator | TASK [Add netbox_enable parameter] ********************************************* 2026-04-20 00:18:45.152599 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:18:45.152692 | orchestrator | 2026-04-20 00:18:45.152706 | orchestrator | TASK [Install HWE kernel package on Ubuntu] ************************************ 2026-04-20 00:18:45.490532 | orchestrator | changed: [testbed-manager] 2026-04-20 00:18:45.490663 | orchestrator | 2026-04-20 00:18:45.490692 | orchestrator | TASK [Check if /etc/OTC_region exist] ****************************************** 2026-04-20 00:18:45.808395 | orchestrator | ok: [testbed-manager] 2026-04-20 00:18:45.808495 | orchestrator | 2026-04-20 00:18:45.808509 | orchestrator | TASK [Add nova_compute_virt_type parameter] ************************************ 2026-04-20 00:18:45.899142 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:18:45.899238 | orchestrator | 2026-04-20 00:18:45.899252 | orchestrator | PLAY 
[Apply role traefik] ****************************************************** 2026-04-20 00:18:45.899265 | orchestrator | 2026-04-20 00:18:45.899276 | orchestrator | TASK [Gathering Facts] ********************************************************* 2026-04-20 00:18:47.634889 | orchestrator | ok: [testbed-manager] 2026-04-20 00:18:47.634995 | orchestrator | 2026-04-20 00:18:47.635010 | orchestrator | TASK [Apply traefik role] ****************************************************** 2026-04-20 00:18:47.725683 | orchestrator | included: osism.services.traefik for testbed-manager 2026-04-20 00:18:47.725792 | orchestrator | 2026-04-20 00:18:47.725817 | orchestrator | TASK [osism.services.traefik : Include config tasks] *************************** 2026-04-20 00:18:47.771921 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/traefik/tasks/config.yml for testbed-manager 2026-04-20 00:18:47.772008 | orchestrator | 2026-04-20 00:18:47.772025 | orchestrator | TASK [osism.services.traefik : Create required directories] ******************** 2026-04-20 00:18:48.860953 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik) 2026-04-20 00:18:48.861082 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik/certificates) 2026-04-20 00:18:48.861099 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik/configuration) 2026-04-20 00:18:48.861111 | orchestrator | 2026-04-20 00:18:48.861123 | orchestrator | TASK [osism.services.traefik : Copy configuration files] *********************** 2026-04-20 00:18:50.628271 | orchestrator | changed: [testbed-manager] => (item=traefik.yml) 2026-04-20 00:18:50.628374 | orchestrator | changed: [testbed-manager] => (item=traefik.env) 2026-04-20 00:18:50.628389 | orchestrator | changed: [testbed-manager] => (item=certificates.yml) 2026-04-20 00:18:50.628401 | orchestrator | 2026-04-20 00:18:50.628413 | orchestrator | TASK [osism.services.traefik : Copy certificate cert files] 
******************** 2026-04-20 00:18:51.270828 | orchestrator | changed: [testbed-manager] => (item=None) 2026-04-20 00:18:51.270921 | orchestrator | changed: [testbed-manager] 2026-04-20 00:18:51.270933 | orchestrator | 2026-04-20 00:18:51.270942 | orchestrator | TASK [osism.services.traefik : Copy certificate key files] ********************* 2026-04-20 00:18:51.898756 | orchestrator | changed: [testbed-manager] => (item=None) 2026-04-20 00:18:51.898857 | orchestrator | changed: [testbed-manager] 2026-04-20 00:18:51.898873 | orchestrator | 2026-04-20 00:18:51.898885 | orchestrator | TASK [osism.services.traefik : Copy dynamic configuration] ********************* 2026-04-20 00:18:51.953376 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:18:51.953459 | orchestrator | 2026-04-20 00:18:51.953473 | orchestrator | TASK [osism.services.traefik : Remove dynamic configuration] ******************* 2026-04-20 00:18:52.316760 | orchestrator | ok: [testbed-manager] 2026-04-20 00:18:52.316859 | orchestrator | 2026-04-20 00:18:52.316876 | orchestrator | TASK [osism.services.traefik : Include service tasks] ************************** 2026-04-20 00:18:52.406926 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/traefik/tasks/service.yml for testbed-manager 2026-04-20 00:18:52.407022 | orchestrator | 2026-04-20 00:18:52.407037 | orchestrator | TASK [osism.services.traefik : Create traefik external network] **************** 2026-04-20 00:18:53.480735 | orchestrator | changed: [testbed-manager] 2026-04-20 00:18:53.480841 | orchestrator | 2026-04-20 00:18:53.480856 | orchestrator | TASK [osism.services.traefik : Copy docker-compose.yml file] ******************* 2026-04-20 00:18:54.284181 | orchestrator | changed: [testbed-manager] 2026-04-20 00:18:54.284309 | orchestrator | 2026-04-20 00:18:54.284327 | orchestrator | TASK [osism.services.traefik : Manage traefik service] ************************* 2026-04-20 00:19:12.585823 | 
orchestrator | changed: [testbed-manager] 2026-04-20 00:19:12.585956 | orchestrator | 2026-04-20 00:19:12.585983 | orchestrator | RUNNING HANDLER [osism.services.traefik : Restart traefik service] ************* 2026-04-20 00:19:12.643943 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:19:12.644073 | orchestrator | 2026-04-20 00:19:12.644089 | orchestrator | PLAY [Deploy manager service] ************************************************** 2026-04-20 00:19:12.644100 | orchestrator | 2026-04-20 00:19:12.644110 | orchestrator | TASK [Gathering Facts] ********************************************************* 2026-04-20 00:19:14.399936 | orchestrator | ok: [testbed-manager] 2026-04-20 00:19:14.400093 | orchestrator | 2026-04-20 00:19:14.400113 | orchestrator | TASK [Apply manager role] ****************************************************** 2026-04-20 00:19:14.514756 | orchestrator | included: osism.services.manager for testbed-manager 2026-04-20 00:19:14.514881 | orchestrator | 2026-04-20 00:19:14.514906 | orchestrator | TASK [osism.services.manager : Include install tasks] ************************** 2026-04-20 00:19:14.566591 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/install-Debian-family.yml for testbed-manager 2026-04-20 00:19:14.566687 | orchestrator | 2026-04-20 00:19:14.566701 | orchestrator | TASK [osism.services.manager : Install required packages] ********************** 2026-04-20 00:19:16.559445 | orchestrator | ok: [testbed-manager] 2026-04-20 00:19:16.559544 | orchestrator | 2026-04-20 00:19:16.559558 | orchestrator | TASK [osism.services.manager : Gather variables for each operating system] ***** 2026-04-20 00:19:16.611911 | orchestrator | ok: [testbed-manager] 2026-04-20 00:19:16.612009 | orchestrator | 2026-04-20 00:19:16.612024 | orchestrator | TASK [osism.services.manager : Include config tasks] *************************** 2026-04-20 00:19:16.714436 | orchestrator | 
included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config.yml for testbed-manager 2026-04-20 00:19:16.714553 | orchestrator | 2026-04-20 00:19:16.714578 | orchestrator | TASK [osism.services.manager : Create required directories] ******************** 2026-04-20 00:19:19.344827 | orchestrator | changed: [testbed-manager] => (item=/opt/ansible) 2026-04-20 00:19:19.344945 | orchestrator | changed: [testbed-manager] => (item=/opt/archive) 2026-04-20 00:19:19.344961 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/configuration) 2026-04-20 00:19:19.344973 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/data) 2026-04-20 00:19:19.344986 | orchestrator | ok: [testbed-manager] => (item=/opt/manager) 2026-04-20 00:19:19.344997 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/secrets) 2026-04-20 00:19:19.345008 | orchestrator | changed: [testbed-manager] => (item=/opt/ansible/secrets) 2026-04-20 00:19:19.345019 | orchestrator | changed: [testbed-manager] => (item=/opt/state) 2026-04-20 00:19:19.345030 | orchestrator | 2026-04-20 00:19:19.345088 | orchestrator | TASK [osism.services.manager : Copy all environment file] ********************** 2026-04-20 00:19:19.909652 | orchestrator | changed: [testbed-manager] 2026-04-20 00:19:19.909735 | orchestrator | 2026-04-20 00:19:19.909745 | orchestrator | TASK [osism.services.manager : Copy client environment file] ******************* 2026-04-20 00:19:20.493331 | orchestrator | changed: [testbed-manager] 2026-04-20 00:19:20.493437 | orchestrator | 2026-04-20 00:19:20.493453 | orchestrator | TASK [osism.services.manager : Include ara config tasks] *********************** 2026-04-20 00:19:20.572931 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-ara.yml for testbed-manager 2026-04-20 00:19:20.573067 | orchestrator | 2026-04-20 00:19:20.573085 | orchestrator | TASK 
[osism.services.manager : Copy ARA environment files] ********************* 2026-04-20 00:19:21.624838 | orchestrator | changed: [testbed-manager] => (item=ara) 2026-04-20 00:19:21.624932 | orchestrator | changed: [testbed-manager] => (item=ara-server) 2026-04-20 00:19:21.624943 | orchestrator | 2026-04-20 00:19:21.624954 | orchestrator | TASK [osism.services.manager : Copy MariaDB environment file] ****************** 2026-04-20 00:19:22.175845 | orchestrator | changed: [testbed-manager] 2026-04-20 00:19:22.175954 | orchestrator | 2026-04-20 00:19:22.175970 | orchestrator | TASK [osism.services.manager : Include vault config tasks] ********************* 2026-04-20 00:19:22.223518 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:19:22.223607 | orchestrator | 2026-04-20 00:19:22.223621 | orchestrator | TASK [osism.services.manager : Include frontend config tasks] ****************** 2026-04-20 00:19:22.285342 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-frontend.yml for testbed-manager 2026-04-20 00:19:22.285435 | orchestrator | 2026-04-20 00:19:22.285450 | orchestrator | TASK [osism.services.manager : Copy frontend environment file] ***************** 2026-04-20 00:19:22.867732 | orchestrator | changed: [testbed-manager] 2026-04-20 00:19:22.867836 | orchestrator | 2026-04-20 00:19:22.867852 | orchestrator | TASK [osism.services.manager : Include ansible config tasks] ******************* 2026-04-20 00:19:22.918780 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-ansible.yml for testbed-manager 2026-04-20 00:19:22.918856 | orchestrator | 2026-04-20 00:19:22.918865 | orchestrator | TASK [osism.services.manager : Copy private ssh keys] ************************** 2026-04-20 00:19:24.187399 | orchestrator | changed: [testbed-manager] => (item=None) 2026-04-20 00:19:24.187483 | orchestrator | changed: [testbed-manager] => 
(item=None) 2026-04-20 00:19:24.187500 | orchestrator | changed: [testbed-manager] 2026-04-20 00:19:24.187511 | orchestrator | 2026-04-20 00:19:24.187522 | orchestrator | TASK [osism.services.manager : Copy ansible environment file] ****************** 2026-04-20 00:19:24.747024 | orchestrator | changed: [testbed-manager] 2026-04-20 00:19:24.747190 | orchestrator | 2026-04-20 00:19:24.747206 | orchestrator | TASK [osism.services.manager : Include netbox config tasks] ******************** 2026-04-20 00:19:24.793636 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:19:24.793730 | orchestrator | 2026-04-20 00:19:24.793744 | orchestrator | TASK [osism.services.manager : Include celery config tasks] ******************** 2026-04-20 00:19:24.883541 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-celery.yml for testbed-manager 2026-04-20 00:19:24.883662 | orchestrator | 2026-04-20 00:19:24.883689 | orchestrator | TASK [osism.services.manager : Set fs.inotify.max_user_watches] **************** 2026-04-20 00:19:25.342649 | orchestrator | changed: [testbed-manager] 2026-04-20 00:19:25.342750 | orchestrator | 2026-04-20 00:19:25.342766 | orchestrator | TASK [osism.services.manager : Set fs.inotify.max_user_instances] ************** 2026-04-20 00:19:25.720737 | orchestrator | changed: [testbed-manager] 2026-04-20 00:19:25.720867 | orchestrator | 2026-04-20 00:19:25.720895 | orchestrator | TASK [osism.services.manager : Copy celery environment files] ****************** 2026-04-20 00:19:26.808211 | orchestrator | changed: [testbed-manager] => (item=conductor) 2026-04-20 00:19:26.808318 | orchestrator | changed: [testbed-manager] => (item=openstack) 2026-04-20 00:19:26.808334 | orchestrator | 2026-04-20 00:19:26.808346 | orchestrator | TASK [osism.services.manager : Copy listener environment file] ***************** 2026-04-20 00:19:27.418363 | orchestrator | changed: [testbed-manager] 2026-04-20 
00:19:27.418466 | orchestrator | 2026-04-20 00:19:27.418482 | orchestrator | TASK [osism.services.manager : Check for conductor.yml] ************************ 2026-04-20 00:19:27.788648 | orchestrator | ok: [testbed-manager] 2026-04-20 00:19:27.788752 | orchestrator | 2026-04-20 00:19:27.788767 | orchestrator | TASK [osism.services.manager : Copy conductor configuration file] ************** 2026-04-20 00:19:28.143666 | orchestrator | changed: [testbed-manager] 2026-04-20 00:19:28.143769 | orchestrator | 2026-04-20 00:19:28.143784 | orchestrator | TASK [osism.services.manager : Copy empty conductor configuration file] ******** 2026-04-20 00:19:28.192493 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:19:28.192615 | orchestrator | 2026-04-20 00:19:28.192637 | orchestrator | TASK [osism.services.manager : Include wrapper config tasks] ******************* 2026-04-20 00:19:28.261227 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-wrapper.yml for testbed-manager 2026-04-20 00:19:28.261321 | orchestrator | 2026-04-20 00:19:28.261334 | orchestrator | TASK [osism.services.manager : Include wrapper vars file] ********************** 2026-04-20 00:19:28.296330 | orchestrator | ok: [testbed-manager] 2026-04-20 00:19:28.296435 | orchestrator | 2026-04-20 00:19:28.296452 | orchestrator | TASK [osism.services.manager : Copy wrapper scripts] *************************** 2026-04-20 00:19:30.273436 | orchestrator | changed: [testbed-manager] => (item=osism) 2026-04-20 00:19:30.273567 | orchestrator | changed: [testbed-manager] => (item=osism-update-docker) 2026-04-20 00:19:30.273591 | orchestrator | changed: [testbed-manager] => (item=osism-update-manager) 2026-04-20 00:19:30.273624 | orchestrator | 2026-04-20 00:19:30.274460 | orchestrator | TASK [osism.services.manager : Copy cilium wrapper script] ********************* 2026-04-20 00:19:30.962962 | orchestrator | changed: [testbed-manager] 2026-04-20 
00:19:30.963124 | orchestrator | 2026-04-20 00:19:30.963142 | orchestrator | TASK [osism.services.manager : Copy hubble wrapper script] ********************* 2026-04-20 00:19:31.686466 | orchestrator | changed: [testbed-manager] 2026-04-20 00:19:31.686569 | orchestrator | 2026-04-20 00:19:31.686585 | orchestrator | TASK [osism.services.manager : Copy flux wrapper script] *********************** 2026-04-20 00:19:32.407806 | orchestrator | changed: [testbed-manager] 2026-04-20 00:19:32.407903 | orchestrator | 2026-04-20 00:19:32.407916 | orchestrator | TASK [osism.services.manager : Include scripts config tasks] ******************* 2026-04-20 00:19:32.470391 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-scripts.yml for testbed-manager 2026-04-20 00:19:32.470484 | orchestrator | 2026-04-20 00:19:32.470498 | orchestrator | TASK [osism.services.manager : Include scripts vars file] ********************** 2026-04-20 00:19:32.512821 | orchestrator | ok: [testbed-manager] 2026-04-20 00:19:32.512912 | orchestrator | 2026-04-20 00:19:32.512925 | orchestrator | TASK [osism.services.manager : Copy scripts] *********************************** 2026-04-20 00:19:33.249389 | orchestrator | changed: [testbed-manager] => (item=osism-include) 2026-04-20 00:19:33.249489 | orchestrator | 2026-04-20 00:19:33.249504 | orchestrator | TASK [osism.services.manager : Include service tasks] ************************** 2026-04-20 00:19:33.333319 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/service.yml for testbed-manager 2026-04-20 00:19:33.333396 | orchestrator | 2026-04-20 00:19:33.333405 | orchestrator | TASK [osism.services.manager : Copy manager systemd unit file] ***************** 2026-04-20 00:19:34.036876 | orchestrator | changed: [testbed-manager] 2026-04-20 00:19:34.036981 | orchestrator | 2026-04-20 00:19:34.036997 | orchestrator | TASK 
[osism.services.manager : Create traefik external network] **************** 2026-04-20 00:19:34.645751 | orchestrator | ok: [testbed-manager] 2026-04-20 00:19:34.645853 | orchestrator | 2026-04-20 00:19:34.645869 | orchestrator | TASK [osism.services.manager : Set mariadb healthcheck for mariadb < 11.0.0] *** 2026-04-20 00:19:34.703794 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:19:34.703891 | orchestrator | 2026-04-20 00:19:34.703906 | orchestrator | TASK [osism.services.manager : Set mariadb healthcheck for mariadb >= 11.0.0] *** 2026-04-20 00:19:34.761365 | orchestrator | ok: [testbed-manager] 2026-04-20 00:19:34.761453 | orchestrator | 2026-04-20 00:19:34.761467 | orchestrator | TASK [osism.services.manager : Copy docker-compose.yml file] ******************* 2026-04-20 00:19:35.583862 | orchestrator | changed: [testbed-manager] 2026-04-20 00:19:35.583963 | orchestrator | 2026-04-20 00:19:35.583977 | orchestrator | TASK [osism.services.manager : Pull container images] ************************** 2026-04-20 00:20:41.588293 | orchestrator | changed: [testbed-manager] 2026-04-20 00:20:41.588411 | orchestrator | 2026-04-20 00:20:41.588437 | orchestrator | TASK [osism.services.manager : Stop and disable old service docker-compose@manager] *** 2026-04-20 00:20:42.508398 | orchestrator | ok: [testbed-manager] 2026-04-20 00:20:42.508500 | orchestrator | 2026-04-20 00:20:42.508514 | orchestrator | TASK [osism.services.manager : Do a manual start of the manager service] ******* 2026-04-20 00:20:42.554395 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:20:42.554536 | orchestrator | 2026-04-20 00:20:42.554555 | orchestrator | TASK [osism.services.manager : Manage manager service] ************************* 2026-04-20 00:20:45.188223 | orchestrator | changed: [testbed-manager] 2026-04-20 00:20:45.188328 | orchestrator | 2026-04-20 00:20:45.188344 | orchestrator | TASK [osism.services.manager : Register that manager service was started] ****** 
2026-04-20 00:20:45.237039 | orchestrator | ok: [testbed-manager] 2026-04-20 00:20:45.237134 | orchestrator | 2026-04-20 00:20:45.237148 | orchestrator | TASK [osism.services.manager : Flush handlers] ********************************* 2026-04-20 00:20:45.237160 | orchestrator | 2026-04-20 00:20:45.237171 | orchestrator | RUNNING HANDLER [osism.services.manager : Restart manager service] ************* 2026-04-20 00:20:45.382820 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:20:45.382918 | orchestrator | 2026-04-20 00:20:45.382956 | orchestrator | RUNNING HANDLER [osism.services.manager : Wait for manager service to start] *** 2026-04-20 00:21:45.450280 | orchestrator | Pausing for 60 seconds 2026-04-20 00:21:45.450393 | orchestrator | changed: [testbed-manager] 2026-04-20 00:21:45.450408 | orchestrator | 2026-04-20 00:21:45.450420 | orchestrator | RUNNING HANDLER [osism.services.manager : Ensure that all containers are up] *** 2026-04-20 00:21:48.950234 | orchestrator | changed: [testbed-manager] 2026-04-20 00:21:48.950332 | orchestrator | 2026-04-20 00:21:48.950346 | orchestrator | RUNNING HANDLER [osism.services.manager : Wait for an healthy manager service] *** 2026-04-20 00:22:30.401155 | orchestrator | FAILED - RETRYING: [testbed-manager]: Wait for an healthy manager service (50 retries left). 2026-04-20 00:22:30.401246 | orchestrator | FAILED - RETRYING: [testbed-manager]: Wait for an healthy manager service (49 retries left). 
2026-04-20 00:22:30.401255 | orchestrator | changed: [testbed-manager]
2026-04-20 00:22:30.401262 | orchestrator |
2026-04-20 00:22:30.401269 | orchestrator | RUNNING HANDLER [osism.services.manager : Copy osismclient bash completion script] ***
2026-04-20 00:22:35.898258 | orchestrator | changed: [testbed-manager]
2026-04-20 00:22:35.898370 | orchestrator |
2026-04-20 00:22:35.898387 | orchestrator | TASK [osism.services.manager : Include initialize tasks] ***********************
2026-04-20 00:22:35.985069 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/initialize.yml for testbed-manager
2026-04-20 00:22:35.985164 | orchestrator |
2026-04-20 00:22:35.985180 | orchestrator | TASK [osism.services.manager : Flush handlers] *********************************
2026-04-20 00:22:35.985192 | orchestrator |
2026-04-20 00:22:35.985203 | orchestrator | TASK [osism.services.manager : Include vault initialize tasks] *****************
2026-04-20 00:22:36.036585 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:22:36.036673 | orchestrator |
2026-04-20 00:22:36.036692 | orchestrator | TASK [osism.services.manager : Include version verification tasks] *************
2026-04-20 00:22:36.113788 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/verify-versions.yml for testbed-manager
2026-04-20 00:22:36.113883 | orchestrator |
2026-04-20 00:22:36.113897 | orchestrator | TASK [osism.services.manager : Deploy service manager version check script] ****
2026-04-20 00:22:36.848044 | orchestrator | changed: [testbed-manager]
2026-04-20 00:22:36.848149 | orchestrator |
2026-04-20 00:22:36.848166 | orchestrator | TASK [osism.services.manager : Execute service manager version check] **********
2026-04-20 00:22:39.969821 | orchestrator | ok: [testbed-manager]
2026-04-20 00:22:39.969922 | orchestrator |
2026-04-20 00:22:39.969939 | orchestrator | TASK [osism.services.manager : Display version check results] ******************
2026-04-20 00:22:40.029769 | orchestrator | ok: [testbed-manager] => {
2026-04-20 00:22:40.029865 | orchestrator |     "version_check_result.stdout_lines": [
2026-04-20 00:22:40.029881 | orchestrator |         "=== OSISM Container Version Check ===",
2026-04-20 00:22:40.029894 | orchestrator |         "Checking running containers against expected versions...",
2026-04-20 00:22:40.029906 | orchestrator |         "",
2026-04-20 00:22:40.029918 | orchestrator |         "Checking service: inventory_reconciler (Inventory Reconciler Service)",
2026-04-20 00:22:40.029930 | orchestrator |         " Expected: registry.osism.tech/osism/inventory-reconciler:0.20260322.0",
2026-04-20 00:22:40.029985 | orchestrator |         " Enabled: true",
2026-04-20 00:22:40.029999 | orchestrator |         " Running: registry.osism.tech/osism/inventory-reconciler:0.20260322.0",
2026-04-20 00:22:40.030010 | orchestrator |         " Status: ✅ MATCH",
2026-04-20 00:22:40.030076 | orchestrator |         "",
2026-04-20 00:22:40.030117 | orchestrator |         "Checking service: osism-ansible (OSISM Ansible Service)",
2026-04-20 00:22:40.030129 | orchestrator |         " Expected: registry.osism.tech/osism/osism-ansible:0.20260322.0",
2026-04-20 00:22:40.030140 | orchestrator |         " Enabled: true",
2026-04-20 00:22:40.030151 | orchestrator |         " Running: registry.osism.tech/osism/osism-ansible:0.20260322.0",
2026-04-20 00:22:40.030161 | orchestrator |         " Status: ✅ MATCH",
2026-04-20 00:22:40.030172 | orchestrator |         "",
2026-04-20 00:22:40.030183 | orchestrator |         "Checking service: osism-kubernetes (Osism-Kubernetes Service)",
2026-04-20 00:22:40.030197 | orchestrator |         " Expected: registry.osism.tech/osism/osism-kubernetes:0.20260322.0",
2026-04-20 00:22:40.030207 | orchestrator |         " Enabled: true",
2026-04-20 00:22:40.030218 | orchestrator |         " Running: registry.osism.tech/osism/osism-kubernetes:0.20260322.0",
2026-04-20 00:22:40.030229 | orchestrator |         " Status: ✅ MATCH",
2026-04-20 00:22:40.030240 | orchestrator |         "",
2026-04-20 00:22:40.030250 | orchestrator |         "Checking service: ceph-ansible (Ceph-Ansible Service)",
2026-04-20 00:22:40.030262 | orchestrator |         " Expected: registry.osism.tech/osism/ceph-ansible:0.20260322.0",
2026-04-20 00:22:40.030272 | orchestrator |         " Enabled: true",
2026-04-20 00:22:40.030283 | orchestrator |         " Running: registry.osism.tech/osism/ceph-ansible:0.20260322.0",
2026-04-20 00:22:40.030294 | orchestrator |         " Status: ✅ MATCH",
2026-04-20 00:22:40.030305 | orchestrator |         "",
2026-04-20 00:22:40.030318 | orchestrator |         "Checking service: kolla-ansible (Kolla-Ansible Service)",
2026-04-20 00:22:40.030330 | orchestrator |         " Expected: registry.osism.tech/osism/kolla-ansible:0.20260328.0",
2026-04-20 00:22:40.030342 | orchestrator |         " Enabled: true",
2026-04-20 00:22:40.030355 | orchestrator |         " Running: registry.osism.tech/osism/kolla-ansible:0.20260328.0",
2026-04-20 00:22:40.030367 | orchestrator |         " Status: ✅ MATCH",
2026-04-20 00:22:40.030379 | orchestrator |         "",
2026-04-20 00:22:40.030393 | orchestrator |         "Checking service: osismclient (OSISM Client)",
2026-04-20 00:22:40.030406 | orchestrator |         " Expected: registry.osism.tech/osism/osism:0.20260320.0",
2026-04-20 00:22:40.030418 | orchestrator |         " Enabled: true",
2026-04-20 00:22:40.030431 | orchestrator |         " Running: registry.osism.tech/osism/osism:0.20260320.0",
2026-04-20 00:22:40.030443 | orchestrator |         " Status: ✅ MATCH",
2026-04-20 00:22:40.030455 | orchestrator |         "",
2026-04-20 00:22:40.030468 | orchestrator |         "Checking service: ara-server (ARA Server)",
2026-04-20 00:22:40.030481 | orchestrator |         " Expected: registry.osism.tech/osism/ara-server:1.7.3",
2026-04-20 00:22:40.030494 | orchestrator |         " Enabled: true",
2026-04-20 00:22:40.030507 | orchestrator |         " Running: registry.osism.tech/osism/ara-server:1.7.3",
2026-04-20 00:22:40.030520 | orchestrator |         " Status: ✅ MATCH",
2026-04-20 00:22:40.030533 | orchestrator |         "",
2026-04-20 00:22:40.030545 | orchestrator |         "Checking service: mariadb (MariaDB for ARA)",
2026-04-20 00:22:40.030559 | orchestrator |         " Expected: registry.osism.tech/dockerhub/library/mariadb:11.8.4",
2026-04-20 00:22:40.030571 | orchestrator |         " Enabled: true",
2026-04-20 00:22:40.030584 | orchestrator |         " Running: registry.osism.tech/dockerhub/library/mariadb:11.8.4",
2026-04-20 00:22:40.030596 | orchestrator |         " Status: ✅ MATCH",
2026-04-20 00:22:40.030607 | orchestrator |         "",
2026-04-20 00:22:40.030617 | orchestrator |         "Checking service: frontend (OSISM Frontend)",
2026-04-20 00:22:40.030628 | orchestrator |         " Expected: registry.osism.tech/osism/osism-frontend:0.20260320.0",
2026-04-20 00:22:40.030639 | orchestrator |         " Enabled: true",
2026-04-20 00:22:40.030649 | orchestrator |         " Running: registry.osism.tech/osism/osism-frontend:0.20260320.0",
2026-04-20 00:22:40.030660 | orchestrator |         " Status: ✅ MATCH",
2026-04-20 00:22:40.030671 | orchestrator |         "",
2026-04-20 00:22:40.030681 | orchestrator |         "Checking service: redis (Redis Cache)",
2026-04-20 00:22:40.030692 | orchestrator |         " Expected: registry.osism.tech/dockerhub/library/redis:7.4.7-alpine",
2026-04-20 00:22:40.030703 | orchestrator |         " Enabled: true",
2026-04-20 00:22:40.030714 | orchestrator |         " Running: registry.osism.tech/dockerhub/library/redis:7.4.7-alpine",
2026-04-20 00:22:40.030724 | orchestrator |         " Status: ✅ MATCH",
2026-04-20 00:22:40.030735 | orchestrator |         "",
2026-04-20 00:22:40.030782 | orchestrator |         "Checking service: api (OSISM API Service)",
2026-04-20 00:22:40.030794 | orchestrator |         " Expected: registry.osism.tech/osism/osism:0.20260320.0",
2026-04-20 00:22:40.030804 | orchestrator |         " Enabled: true",
2026-04-20 00:22:40.030815 | orchestrator |         " Running: registry.osism.tech/osism/osism:0.20260320.0",
2026-04-20 00:22:40.030826 | orchestrator |         " Status: ✅ MATCH",
2026-04-20 00:22:40.030836 | orchestrator |         "",
2026-04-20 00:22:40.030847 | orchestrator |         "Checking service: listener (OpenStack Event Listener)",
2026-04-20 00:22:40.030858 | orchestrator |         " Expected: registry.osism.tech/osism/osism:0.20260320.0",
2026-04-20 00:22:40.030869 | orchestrator |         " Enabled: true",
2026-04-20 00:22:40.030879 | orchestrator |         " Running: registry.osism.tech/osism/osism:0.20260320.0",
2026-04-20 00:22:40.030890 | orchestrator |         " Status: ✅ MATCH",
2026-04-20 00:22:40.030902 | orchestrator |         "",
2026-04-20 00:22:40.030912 | orchestrator |         "Checking service: openstack (OpenStack Integration)",
2026-04-20 00:22:40.030933 | orchestrator |         " Expected: registry.osism.tech/osism/osism:0.20260320.0",
2026-04-20 00:22:40.030978 | orchestrator |         " Enabled: true",
2026-04-20 00:22:40.030990 | orchestrator |         " Running: registry.osism.tech/osism/osism:0.20260320.0",
2026-04-20 00:22:40.031001 | orchestrator |         " Status: ✅ MATCH",
2026-04-20 00:22:40.031012 | orchestrator |         "",
2026-04-20 00:22:40.031022 | orchestrator |         "Checking service: beat (Celery Beat Scheduler)",
2026-04-20 00:22:40.031033 | orchestrator |         " Expected: registry.osism.tech/osism/osism:0.20260320.0",
2026-04-20 00:22:40.031044 | orchestrator |         " Enabled: true",
2026-04-20 00:22:40.031055 | orchestrator |         " Running: registry.osism.tech/osism/osism:0.20260320.0",
2026-04-20 00:22:40.031083 | orchestrator |         " Status: ✅ MATCH",
2026-04-20 00:22:40.031094 | orchestrator |         "",
2026-04-20 00:22:40.031105 | orchestrator |         "Checking service: flower (Celery Flower Monitor)",
2026-04-20 00:22:40.031116 | orchestrator |         " Expected: registry.osism.tech/osism/osism:0.20260320.0",
2026-04-20 00:22:40.031127 | orchestrator |         " Enabled: true",
2026-04-20 00:22:40.031137 | orchestrator |         " Running: registry.osism.tech/osism/osism:0.20260320.0",
2026-04-20 00:22:40.031148 | orchestrator |         " Status: ✅ MATCH",
2026-04-20 00:22:40.031158 | orchestrator |         "",
2026-04-20 00:22:40.031169 | orchestrator |         "=== Summary ===",
2026-04-20 00:22:40.031180 | orchestrator |         "Errors (version mismatches): 0",
2026-04-20 00:22:40.031190 | orchestrator |         "Warnings (expected containers not running): 0",
2026-04-20 00:22:40.031201 | orchestrator |         "",
2026-04-20 00:22:40.031212 | orchestrator |         "✅ All running containers match expected versions!"
2026-04-20 00:22:40.031223 | orchestrator |     ]
2026-04-20 00:22:40.031234 | orchestrator | }
2026-04-20 00:22:40.031245 | orchestrator |
2026-04-20 00:22:40.031257 | orchestrator | TASK [osism.services.manager : Skip version check due to service configuration] ***
2026-04-20 00:22:40.089425 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:22:40.089521 | orchestrator |
2026-04-20 00:22:40.089535 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:22:40.089548 | orchestrator | testbed-manager : ok=70 changed=37 unreachable=0 failed=0 skipped=12 rescued=0 ignored=0
2026-04-20 00:22:40.089560 | orchestrator |
2026-04-20 00:22:40.190500 | orchestrator | + [[ -e /opt/venv/bin/activate ]]
2026-04-20 00:22:40.190604 | orchestrator | + deactivate
2026-04-20 00:22:40.190619 | orchestrator | + '[' -n /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin ']'
2026-04-20 00:22:40.190633 | orchestrator | + PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
2026-04-20 00:22:40.190644 | orchestrator | + export PATH
2026-04-20 00:22:40.190655 | orchestrator | + unset _OLD_VIRTUAL_PATH
2026-04-20 00:22:40.190666 | orchestrator | + '[' -n '' ']'
2026-04-20 00:22:40.190677 | orchestrator | + hash -r
2026-04-20 00:22:40.190687 | orchestrator | + '[' -n '' ']'
2026-04-20 00:22:40.190698 | orchestrator | + unset VIRTUAL_ENV
2026-04-20 00:22:40.190708 | orchestrator | + unset VIRTUAL_ENV_PROMPT
2026-04-20 00:22:40.190719 | orchestrator | + '[' '!' '' = nondestructive ']'
2026-04-20 00:22:40.190730 | orchestrator | + unset -f deactivate
2026-04-20 00:22:40.190742 | orchestrator | + cp /home/dragon/.ssh/id_rsa.pub /opt/ansible/secrets/id_rsa.operator.pub
2026-04-20 00:22:40.196131 | orchestrator | + [[ ceph-ansible == \c\e\p\h\-\a\n\s\i\b\l\e ]]
2026-04-20 00:22:40.196174 | orchestrator | + wait_for_container_healthy 60 ceph-ansible
2026-04-20 00:22:40.196187 | orchestrator | + local max_attempts=60
2026-04-20 00:22:40.196199 | orchestrator | + local name=ceph-ansible
2026-04-20 00:22:40.196210 | orchestrator | + local attempt_num=1
2026-04-20 00:22:40.197160 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible
2026-04-20 00:22:40.232555 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]]
2026-04-20 00:22:40.232652 | orchestrator | + wait_for_container_healthy 60 kolla-ansible
2026-04-20 00:22:40.232667 | orchestrator | + local max_attempts=60
2026-04-20 00:22:40.232679 | orchestrator | + local name=kolla-ansible
2026-04-20 00:22:40.232690 | orchestrator | + local attempt_num=1
2026-04-20 00:22:40.233343 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' kolla-ansible
2026-04-20 00:22:40.269847 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]]
2026-04-20 00:22:40.269933 | orchestrator | + wait_for_container_healthy 60 osism-ansible
2026-04-20 00:22:40.270003 | orchestrator | + local max_attempts=60
2026-04-20 00:22:40.270068 | orchestrator | + local name=osism-ansible
2026-04-20 00:22:40.270090 | orchestrator | + local attempt_num=1
2026-04-20 00:22:40.271221 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' osism-ansible
2026-04-20 00:22:40.307637 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]]
2026-04-20 00:22:40.307731 | orchestrator | + [[ true == \t\r\u\e ]]
2026-04-20 00:22:40.307744 | orchestrator | + sh -c /opt/configuration/scripts/disable-ara.sh
2026-04-20 00:22:40.943436 | orchestrator | + docker compose --project-directory /opt/manager ps
2026-04-20 00:22:41.106164 | orchestrator | NAME IMAGE COMMAND SERVICE CREATED STATUS PORTS
2026-04-20 00:22:41.106279 | orchestrator | ceph-ansible registry.osism.tech/osism/ceph-ansible:0.20260322.0 "/entrypoint.sh osis…" ceph-ansible About a minute ago Up About a minute (healthy)
2026-04-20 00:22:41.106321 | orchestrator | kolla-ansible registry.osism.tech/osism/kolla-ansible:0.20260328.0 "/entrypoint.sh osis…" kolla-ansible About a minute ago Up About a minute (healthy)
2026-04-20 00:22:41.106337 | orchestrator | manager-api-1 registry.osism.tech/osism/osism:0.20260320.0 "/sbin/tini -- osism…" api About a minute ago Up About a minute (healthy) 192.168.16.5:8000->8000/tcp
2026-04-20 00:22:41.106351 | orchestrator | manager-ara-server-1 registry.osism.tech/osism/ara-server:1.7.3 "sh -c '/wait && /ru…" ara-server About a minute ago Up About a minute (healthy) 8000/tcp
2026-04-20 00:22:41.106363 | orchestrator | manager-beat-1 registry.osism.tech/osism/osism:0.20260320.0 "/sbin/tini -- osism…" beat About a minute ago Up About a minute (healthy)
2026-04-20 00:22:41.106375 | orchestrator | manager-flower-1 registry.osism.tech/osism/osism:0.20260320.0 "/sbin/tini -- osism…" flower About a minute ago Up About a minute (healthy)
2026-04-20 00:22:41.106387 | orchestrator | manager-inventory_reconciler-1 registry.osism.tech/osism/inventory-reconciler:0.20260322.0 "/sbin/tini -- /entr…" inventory_reconciler About a minute ago Up 52 seconds (healthy)
2026-04-20 00:22:41.106399 | orchestrator | manager-listener-1 registry.osism.tech/osism/osism:0.20260320.0 "/sbin/tini -- osism…" listener About a minute ago Up About a minute (healthy)
2026-04-20 00:22:41.106411 | orchestrator | manager-mariadb-1 registry.osism.tech/dockerhub/library/mariadb:11.8.4 "docker-entrypoint.s…" mariadb About a minute ago Up About a minute (healthy) 3306/tcp
2026-04-20 00:22:41.106423 | orchestrator | manager-openstack-1 registry.osism.tech/osism/osism:0.20260320.0 "/sbin/tini -- osism…" openstack About a minute ago Up About a minute (healthy)
2026-04-20 00:22:41.106435 | orchestrator | manager-redis-1 registry.osism.tech/dockerhub/library/redis:7.4.7-alpine "docker-entrypoint.s…" redis About a minute ago Up About a minute (healthy) 6379/tcp
2026-04-20 00:22:41.106476 | orchestrator | osism-ansible registry.osism.tech/osism/osism-ansible:0.20260322.0 "/entrypoint.sh osis…" osism-ansible About a minute ago Up About a minute (healthy)
2026-04-20 00:22:41.106489 | orchestrator | osism-frontend registry.osism.tech/osism/osism-frontend:0.20260320.0 "docker-entrypoint.s…" frontend About a minute ago Up About a minute 192.168.16.5:3000->3000/tcp
2026-04-20 00:22:41.106499 | orchestrator | osism-kubernetes registry.osism.tech/osism/osism-kubernetes:0.20260322.0 "/entrypoint.sh osis…" osism-kubernetes About a minute ago Up About a minute (healthy)
2026-04-20 00:22:41.106512 | orchestrator | osismclient registry.osism.tech/osism/osism:0.20260320.0 "/sbin/tini -- sleep…" osismclient About a minute ago Up About a minute (healthy)
2026-04-20 00:22:41.111452 | orchestrator | ++ semver 10.0.0 7.0.0
2026-04-20 00:22:41.155897 | orchestrator | + [[ 1 -ge 0 ]]
2026-04-20 00:22:41.155982 | orchestrator | + sed -i s/community.general.yaml/osism.commons.still_alive/ /opt/configuration/environments/ansible.cfg
2026-04-20 00:22:41.159141 | orchestrator | + osism apply resolvconf -l testbed-manager
2026-04-20 00:22:53.502733 | orchestrator | 2026-04-20 00:22:53 | INFO  | Prepare task for execution of resolvconf.
2026-04-20 00:22:53.710683 | orchestrator | 2026-04-20 00:22:53 | INFO  | Task f55f5d26-194d-48d4-99bf-14944dfe0afa (resolvconf) was prepared for execution.
2026-04-20 00:22:53.710790 | orchestrator | 2026-04-20 00:22:53 | INFO  | It takes a moment until task f55f5d26-194d-48d4-99bf-14944dfe0afa (resolvconf) has been started and output is visible here.
2026-04-20 00:23:06.450420 | orchestrator |
2026-04-20 00:23:06.450533 | orchestrator | PLAY [Apply role resolvconf] ***************************************************
2026-04-20 00:23:06.450547 | orchestrator |
2026-04-20 00:23:06.450558 | orchestrator | TASK [Gathering Facts] *********************************************************
2026-04-20 00:23:06.450581 | orchestrator | Monday 20 April 2026 00:22:56 +0000 (0:00:00.175) 0:00:00.175 **********
2026-04-20 00:23:06.450591 | orchestrator | ok: [testbed-manager]
2026-04-20 00:23:06.450602 | orchestrator |
2026-04-20 00:23:06.450612 | orchestrator | TASK [osism.commons.resolvconf : Check minimum and maximum number of name servers] ***
2026-04-20 00:23:06.450623 | orchestrator | Monday 20 April 2026 00:23:00 +0000 (0:00:03.627) 0:00:03.803 **********
2026-04-20 00:23:06.450633 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:23:06.450643 | orchestrator |
2026-04-20 00:23:06.450653 | orchestrator | TASK [osism.commons.resolvconf : Include resolvconf tasks] *********************
2026-04-20 00:23:06.450663 | orchestrator | Monday 20 April 2026 00:23:00 +0000 (0:00:00.054) 0:00:03.857 **********
2026-04-20 00:23:06.450672 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-resolv.yml for testbed-manager
2026-04-20 00:23:06.450683 | orchestrator |
2026-04-20 00:23:06.450693 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific installation tasks] ***
2026-04-20 00:23:06.450702 | orchestrator | Monday 20 April 2026 00:23:00 +0000 (0:00:00.077) 0:00:03.935 **********
2026-04-20 00:23:06.450712 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/install-Debian-family.yml for testbed-manager
2026-04-20 00:23:06.450722 | orchestrator |
2026-04-20 00:23:06.450731 | orchestrator | TASK [osism.commons.resolvconf : Remove packages configuring /etc/resolv.conf] ***
2026-04-20 00:23:06.450741 | orchestrator | Monday 20 April 2026 00:23:00 +0000 (0:00:00.068) 0:00:04.004 **********
2026-04-20 00:23:06.450750 | orchestrator | ok: [testbed-manager]
2026-04-20 00:23:06.450760 | orchestrator |
2026-04-20 00:23:06.450769 | orchestrator | TASK [osism.commons.resolvconf : Install package systemd-resolved] *************
2026-04-20 00:23:06.450779 | orchestrator | Monday 20 April 2026 00:23:01 +0000 (0:00:01.124) 0:00:05.128 **********
2026-04-20 00:23:06.450807 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:23:06.450817 | orchestrator |
2026-04-20 00:23:06.450827 | orchestrator | TASK [osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf] *****
2026-04-20 00:23:06.450836 | orchestrator | Monday 20 April 2026 00:23:01 +0000 (0:00:00.053) 0:00:05.182 **********
2026-04-20 00:23:06.450846 | orchestrator | ok: [testbed-manager]
2026-04-20 00:23:06.450855 | orchestrator |
2026-04-20 00:23:06.450865 | orchestrator | TASK [osism.commons.resolvconf : Archive existing file /etc/resolv.conf] *******
2026-04-20 00:23:06.450874 | orchestrator | Monday 20 April 2026 00:23:02 +0000 (0:00:00.522) 0:00:05.705 **********
2026-04-20 00:23:06.450883 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:23:06.450893 | orchestrator |
2026-04-20 00:23:06.450903 | orchestrator | TASK [osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf] ***
2026-04-20 00:23:06.450913 | orchestrator | Monday 20 April 2026 00:23:02 +0000 (0:00:00.077) 0:00:05.782 **********
2026-04-20 00:23:06.450922 | orchestrator | changed: [testbed-manager]
2026-04-20 00:23:06.450932 | orchestrator |
2026-04-20 00:23:06.450941 | orchestrator | TASK [osism.commons.resolvconf : Copy configuration files] *********************
2026-04-20 00:23:06.450989 | orchestrator | Monday 20 April 2026 00:23:02 +0000 (0:00:00.562) 0:00:06.345 **********
2026-04-20 00:23:06.451001 | orchestrator | changed: [testbed-manager]
2026-04-20 00:23:06.451013 | orchestrator |
2026-04-20 00:23:06.451039 | orchestrator | TASK [osism.commons.resolvconf : Start/enable systemd-resolved service] ********
2026-04-20 00:23:06.451051 | orchestrator | Monday 20 April 2026 00:23:04 +0000 (0:00:01.084) 0:00:07.430 **********
2026-04-20 00:23:06.451062 | orchestrator | ok: [testbed-manager]
2026-04-20 00:23:06.451073 | orchestrator |
2026-04-20 00:23:06.451083 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific configuration tasks] ***
2026-04-20 00:23:06.451094 | orchestrator | Monday 20 April 2026 00:23:05 +0000 (0:00:00.993) 0:00:08.423 **********
2026-04-20 00:23:06.451105 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-Debian-family.yml for testbed-manager
2026-04-20 00:23:06.451116 | orchestrator |
2026-04-20 00:23:06.451127 | orchestrator | TASK [osism.commons.resolvconf : Restart systemd-resolved service] *************
2026-04-20 00:23:06.451137 | orchestrator | Monday 20 April 2026 00:23:05 +0000 (0:00:00.082) 0:00:08.506 **********
2026-04-20 00:23:06.451148 | orchestrator | changed: [testbed-manager]
2026-04-20 00:23:06.451159 | orchestrator |
2026-04-20 00:23:06.451170 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:23:06.451181 | orchestrator | testbed-manager : ok=10  changed=3  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0
2026-04-20 00:23:06.451192 | orchestrator |
2026-04-20 00:23:06.451203 | orchestrator |
2026-04-20 00:23:06.451214 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:23:06.451225 | orchestrator | Monday 20 April 2026 00:23:06 +0000 (0:00:01.133) 0:00:09.639 **********
2026-04-20 00:23:06.451236 | orchestrator | ===============================================================================
2026-04-20 00:23:06.451246 | orchestrator | Gathering Facts --------------------------------------------------------- 3.63s
2026-04-20 00:23:06.451260 | orchestrator | osism.commons.resolvconf : Restart systemd-resolved service ------------- 1.13s
2026-04-20 00:23:06.451278 | orchestrator | osism.commons.resolvconf : Remove packages configuring /etc/resolv.conf --- 1.12s
2026-04-20 00:23:06.451294 | orchestrator | osism.commons.resolvconf : Copy configuration files --------------------- 1.08s
2026-04-20 00:23:06.451311 | orchestrator | osism.commons.resolvconf : Start/enable systemd-resolved service -------- 0.99s
2026-04-20 00:23:06.451327 | orchestrator | osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf --- 0.56s
2026-04-20 00:23:06.451365 | orchestrator | osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf ----- 0.52s
2026-04-20 00:23:06.451391 | orchestrator | osism.commons.resolvconf : Include distribution specific configuration tasks --- 0.08s
2026-04-20 00:23:06.451411 | orchestrator | osism.commons.resolvconf : Archive existing file /etc/resolv.conf ------- 0.08s
2026-04-20 00:23:06.451440 | orchestrator | osism.commons.resolvconf : Include resolvconf tasks --------------------- 0.08s
2026-04-20 00:23:06.451458 | orchestrator | osism.commons.resolvconf : Include distribution specific installation tasks --- 0.07s
2026-04-20 00:23:06.451475 | orchestrator | osism.commons.resolvconf : Check minimum and maximum number of name servers --- 0.05s
2026-04-20 00:23:06.451486 | orchestrator | osism.commons.resolvconf : Install package systemd-resolved ------------- 0.05s
2026-04-20 00:23:06.622089 | orchestrator | + osism apply sshconfig
2026-04-20 00:23:17.996082 | orchestrator | 2026-04-20 00:23:17 | INFO  | Prepare task for execution of sshconfig.
2026-04-20 00:23:18.074523 | orchestrator | 2026-04-20 00:23:18 | INFO  | Task 8b8e7d62-606b-48f8-876d-9d0e33240646 (sshconfig) was prepared for execution.
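`osism apply sshconfig` (invoked above) applies the osism.commons.sshconfig role, which writes one config snippet per host into `.ssh/config.d` and assembles them into a single ssh config. The drop-in/assemble pattern can be sketched stand-alone; the paths, options, and user name below are illustrative, not the role's actual template:

```shell
#!/usr/bin/env bash
# Illustrative drop-in/assemble pattern: one snippet per host under
# .ssh/config.d, concatenated into a single ssh config file. The role
# itself uses Ansible's template + assemble modules for the same effect.
set -euo pipefail

workdir="$(mktemp -d)"          # stand-in for the operator's home directory
confdir="$workdir/.ssh/config.d"
mkdir -p "$confdir"

for host in testbed-manager testbed-node-0 testbed-node-1 testbed-node-2 \
            testbed-node-3 testbed-node-4 testbed-node-5; do
    cat > "$confdir/$host" <<EOF
Host $host
    User dragon
    StrictHostKeyChecking accept-new
EOF
done

# Assemble the per-host snippets into one config.
cat "$confdir"/* > "$workdir/.ssh/config"
```

This mirrors the task sequence visible in the play output: create `.ssh/config.d`, write a config per host, then assemble.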
2026-04-20 00:23:18.074625 | orchestrator | 2026-04-20 00:23:18 | INFO  | It takes a moment until task 8b8e7d62-606b-48f8-876d-9d0e33240646 (sshconfig) has been started and output is visible here.
2026-04-20 00:23:29.107883 | orchestrator |
2026-04-20 00:23:29.108039 | orchestrator | PLAY [Apply role sshconfig] ****************************************************
2026-04-20 00:23:29.108065 | orchestrator |
2026-04-20 00:23:29.108084 | orchestrator | TASK [osism.commons.sshconfig : Get home directory of operator user] ***********
2026-04-20 00:23:29.108104 | orchestrator | Monday 20 April 2026 00:23:21 +0000 (0:00:00.189) 0:00:00.189 **********
2026-04-20 00:23:29.108123 | orchestrator | ok: [testbed-manager]
2026-04-20 00:23:29.108143 | orchestrator |
2026-04-20 00:23:29.108155 | orchestrator | TASK [osism.commons.sshconfig : Ensure .ssh/config.d exist] ********************
2026-04-20 00:23:29.108166 | orchestrator | Monday 20 April 2026 00:23:22 +0000 (0:00:00.913) 0:00:01.102 **********
2026-04-20 00:23:29.108177 | orchestrator | changed: [testbed-manager]
2026-04-20 00:23:29.108190 | orchestrator |
2026-04-20 00:23:29.108201 | orchestrator | TASK [osism.commons.sshconfig : Ensure config for each host exist] *************
2026-04-20 00:23:29.108212 | orchestrator | Monday 20 April 2026 00:23:22 +0000 (0:00:00.534) 0:00:01.637 **********
2026-04-20 00:23:29.108222 | orchestrator | changed: [testbed-manager] => (item=testbed-manager)
2026-04-20 00:23:29.108233 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0)
2026-04-20 00:23:29.108244 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1)
2026-04-20 00:23:29.108254 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2)
2026-04-20 00:23:29.108265 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3)
2026-04-20 00:23:29.108276 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4)
2026-04-20 00:23:29.108286 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5)
2026-04-20 00:23:29.108297 | orchestrator |
2026-04-20 00:23:29.108308 | orchestrator | TASK [osism.commons.sshconfig : Add extra config] ******************************
2026-04-20 00:23:29.108318 | orchestrator | Monday 20 April 2026 00:23:28 +0000 (0:00:05.628) 0:00:07.265 **********
2026-04-20 00:23:29.108329 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:23:29.108339 | orchestrator |
2026-04-20 00:23:29.108350 | orchestrator | TASK [osism.commons.sshconfig : Assemble ssh config] ***************************
2026-04-20 00:23:29.108361 | orchestrator | Monday 20 April 2026 00:23:28 +0000 (0:00:00.098) 0:00:07.364 **********
2026-04-20 00:23:29.108371 | orchestrator | changed: [testbed-manager]
2026-04-20 00:23:29.108382 | orchestrator |
2026-04-20 00:23:29.108393 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:23:29.108405 | orchestrator | testbed-manager : ok=4  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0
2026-04-20 00:23:29.108416 | orchestrator |
2026-04-20 00:23:29.108427 | orchestrator |
2026-04-20 00:23:29.108462 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:23:29.108474 | orchestrator | Monday 20 April 2026 00:23:28 +0000 (0:00:00.525) 0:00:07.889 **********
2026-04-20 00:23:29.108484 | orchestrator | ===============================================================================
2026-04-20 00:23:29.108524 | orchestrator | osism.commons.sshconfig : Ensure config for each host exist ------------- 5.63s
2026-04-20 00:23:29.108535 | orchestrator | osism.commons.sshconfig : Get home directory of operator user ----------- 0.91s
2026-04-20 00:23:29.108546 | orchestrator | osism.commons.sshconfig : Ensure .ssh/config.d exist -------------------- 0.53s
2026-04-20 00:23:29.108556 | orchestrator | osism.commons.sshconfig : Assemble ssh config --------------------------- 0.53s
2026-04-20 00:23:29.108567 | orchestrator | osism.commons.sshconfig : Add extra config ------------------------------ 0.10s
2026-04-20 00:23:29.273302 | orchestrator | + osism apply known-hosts
2026-04-20 00:23:40.640203 | orchestrator | 2026-04-20 00:23:40 | INFO  | Prepare task for execution of known-hosts.
2026-04-20 00:23:40.711741 | orchestrator | 2026-04-20 00:23:40 | INFO  | Task 707a68ce-fd5b-406e-a4d1-4bd7cef71012 (known-hosts) was prepared for execution.
2026-04-20 00:23:40.711840 | orchestrator | 2026-04-20 00:23:40 | INFO  | It takes a moment until task 707a68ce-fd5b-406e-a4d1-4bd7cef71012 (known-hosts) has been started and output is visible here.
2026-04-20 00:23:56.126454 | orchestrator |
2026-04-20 00:23:56.126584 | orchestrator | PLAY [Apply role known_hosts] **************************************************
2026-04-20 00:23:56.126609 | orchestrator |
2026-04-20 00:23:56.126627 | orchestrator | TASK [osism.commons.known_hosts : Run ssh-keyscan for all hosts with hostname] ***
2026-04-20 00:23:56.126648 | orchestrator | Monday 20 April 2026 00:23:43 +0000 (0:00:00.192) 0:00:00.192 **********
2026-04-20 00:23:56.126668 | orchestrator | ok: [testbed-manager] => (item=testbed-manager)
2026-04-20 00:23:56.126686 | orchestrator | ok: [testbed-manager] => (item=testbed-node-0)
2026-04-20 00:23:56.126705 | orchestrator | ok: [testbed-manager] => (item=testbed-node-1)
2026-04-20 00:23:56.126724 | orchestrator | ok: [testbed-manager] => (item=testbed-node-2)
2026-04-20 00:23:56.126744 | orchestrator | ok: [testbed-manager] => (item=testbed-node-3)
2026-04-20 00:23:56.126764 | orchestrator | ok: [testbed-manager] => (item=testbed-node-4)
2026-04-20 00:23:56.126784 | orchestrator | ok: [testbed-manager] => (item=testbed-node-5)
2026-04-20 00:23:56.126801 | orchestrator |
2026-04-20 00:23:56.126834 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with hostname] ***
2026-04-20 00:23:56.126856 | orchestrator | Monday 20 April 2026 00:23:50 +0000 (0:00:06.423) 0:00:06.616 **********
2026-04-20 00:23:56.126877 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-manager)
2026-04-20 00:23:56.126898 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-0)
2026-04-20 00:23:56.126917 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-1)
2026-04-20 00:23:56.126938 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-2)
2026-04-20 00:23:56.126984 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-3)
2026-04-20 00:23:56.127006 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-4)
2026-04-20 00:23:56.127027 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-5)
2026-04-20 00:23:56.127048 | orchestrator |
2026-04-20 00:23:56.127069 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] ***********
2026-04-20 00:23:56.127089 | orchestrator | Monday 20 April 2026 00:23:50 +0000 (0:00:00.166) 0:00:06.782 **********
2026-04-20 00:23:56.127145 |
orchestrator | changed: [testbed-manager] => (item=testbed-manager ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCdtTgSTIECGzduy7TeEMuiPHf4Zyfcxji8JzbBo8t3MgyFDFJ1M1779It4G35S6IaJzKa0wYM1z3Fr681TiAbzEbKTsb4A1Dn0ojC+yH605XbyEGYLnlp6S/PKU1SDcvA9bdGJYFcTj+otwY6ZoQZftnqcWn0oSj1tnjBri1Fl/7FDmkXfrF5imtMs1zvl5g40+vYqXj1X8bqO4NpgjzOwo7AYcAsJ+LUlxcWYlA6wUFbh6Sj/Jh8EDrP8YSQKpnAoH9tKSP0SptnOtv9ebOeo8N0S5a/dbbAHsRWOWNUpzofd+ZMhTNXdIPhPersp3cGZyMtN84uRbAsb5o/RMgIW7JB8the3XySUzbtNjstJXmLqC6m6q06nCTsIeurB3lpiOFFzY6EEZzpv1EstBNoWHrHVhER9nhmz7dgnliFFnUDVytKE1lcs+YymaRtMo8tSzMQ1xzjozJIYsTQCZgxwsLDd0wfAzUEDTvKSIaLExALPXDZoBVMDy+ohPlFyC98=) 2026-04-20 00:23:56.127177 | orchestrator | changed: [testbed-manager] => (item=testbed-manager ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBL0FPsNyE8ayEFl6Gld8LioSyF+nIurXoRSXRj2WBqDNTZ1wvGBLFE1rs3Q0ZsAtX2/1VHZSR7ZMz/zbOMG/4FA=) 2026-04-20 00:23:56.127198 | orchestrator | changed: [testbed-manager] => (item=testbed-manager ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJ5qJxWoBXzjis/nwZfca9qjqqUPprNJK+tedHt0n993) 2026-04-20 00:23:56.127218 | orchestrator | 2026-04-20 00:23:56.127238 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-20 00:23:56.127258 | orchestrator | Monday 20 April 2026 00:23:51 +0000 (0:00:01.273) 0:00:08.056 ********** 2026-04-20 00:23:56.127278 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHm8zu6dNL34zbSSKoNLtNwggkJtrspWzPWUKOhBxjSsFUiKgwhJfeOUtDcBVSqfHVSAhDkMWzar8UkyfZLZEUQ=) 2026-04-20 00:23:56.127297 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPOjWMiYTHtdb7gu6/m85xTjEZjDkh7b14jvCb49Z6Es) 2026-04-20 00:23:56.127352 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCzraFKYr/vWrn0GIzECtGAkI5RzOJ5JeEuDoVlETYlidxGYUk28u/DFPW+N+dLPG3Mk42BThGQ/TCqe42PfkdnbYdNh4gvcTzNNDzQnYTAHzUcIoLTAOiJx4ag2Xghmhbaq6w4GzuVRgWJa6VOnhURtYUtFLRpvhyRV2KJvxLRPWq6hUlkpxWcTC3QrysUVwUqdcvCrlKqZxgsi/BWp2e6E9OCSKMdPS07OqYxIvt01s0ZJ//vcSGDH9BzeY2WVYRwP4FM8Rql9f9/c28rYn22xPF1QTFTF4Lh3eJp9XWndhMljOvyKGLSUDA5yPa1IFvrG3IRYbt/GpobwBqIUL0wnLZE5/kLU1RYuuwGGjBsYmmdSghVw4Ofn6zakF/gfMVk1uHACLR1g9oTjhwLLtQiyNIARdNX6lxH6/SjcRM6Rm5jkRIm0D3YZhBUnwyhQCGBoys/1Tr+ghxgVWET8Zrd7ondPbxRiFGZrMBdpylCCnk0m22n/70gsv8BGjQopxs=) 2026-04-20 00:23:56.127374 | orchestrator | 2026-04-20 00:23:56.127391 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-20 00:23:56.127410 | orchestrator | Monday 20 April 2026 00:23:52 +0000 (0:00:01.020) 0:00:09.076 ********** 2026-04-20 00:23:56.127428 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDDVVl/34xTMhHDZ9E+NRyTo2O0NisWIqFbnXwX4USaGrjcB5MnvDaECiD6u/0pG6+RJxHfr2E8HcInvpMDYWGFR3kt8O8QvZ2iPr9ZPMP+wObVEOjuUwvVOcWOmI5RanPfouYrx5DCLzfakedAi2dl6ELJHYU/ANAkoDKlubD5ye2GKVcF5F+24EcCchJ2Ee+qHy8HYe3tmqmSwwEF4aIpnskWgTTGQKDJHSkOhfsKYUClgbE0No5ix3XbB7ILEi2qbteVel55dzXo8iPUObkSwINOczWOYeTCB2VYbgflRY8S+H3UvCtDUlNn6okydhdygqrBNMQTrXFyfsv2Pzo9t4IYnldw0EDFMmI0vQ0rZWm3jHKh36Pgb21Ly6Q++i7yOV+BB13LJUmKVFifI+VgYb2pyMp0BLMOqe+7Eb8ryGXtENdR+Jf2cpzK+mojh9yS4iBfZ4kj4hF5UWIyalS+NsM9akFswwaLg/lTAwKt86DE3reN3goMpOXLf7D4oXc=) 2026-04-20 00:23:56.127446 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK1RWtM+gsGNomJYlHoKAsCTS2beF/IuXWk3GuMThm99AD5z3ChiaPNpdYnqbU+vhzxfxuxSQfBLX0HdIFArkDQ=) 2026-04-20 00:23:56.127465 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDUTqemCCeW72aN8ml8lZLhhRD31XXmCGNGgXNqRo/e9) 2026-04-20 00:23:56.127482 | orchestrator | 2026-04-20 00:23:56.127500 | 
orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-20 00:23:56.127633 | orchestrator | Monday 20 April 2026 00:23:53 +0000 (0:00:01.028) 0:00:10.104 ********** 2026-04-20 00:23:56.127663 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCm2ujvuLDcGAU2suuR4Oh2VkLOnLCOqbKnMkRaB6bKW+L8hJVW3+skxtsQuo8gsNETGgP8M37Y5XuXxN9EA5rnsaX9njqWWnTmiyyKkfu6rHGtzoyRLO0MaklTvB4++uJqS9moYFmdS41ODnvvEoHFfFT2/P8ec5ubvVpj36B8UKG/O97X2tPjbN5Spl/xlyzMTnRe0q6djzB37nSu5d5rpChPMKRhc8mnpSELVazWuEOVPAugb+cSBlXTcxxhsLwH8KVqk/NJTrrV+lDRy9rkDJVho3djCo0rhcvJnSU/bw/j/UAbQov8RCgjk88IhkGyPAlvPMZ89wu0uE+cLTvmEQu69p2Gcc4SGwUzQirRXn17uM4MxXrfl9+RyDqwJ88zS6UoDmX85wl+WLQLAOL+Lalz3vZsgVUKFwn2XW9P/Ofqfovax3us1t0Dk96YDck7izQdzAWHDrkpU7WqXfxTX9KAnwBN4M5TMaj5ZPnInAghCklkVEna/jw+nnrVDOk=) 2026-04-20 00:23:56.127698 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLga5y18K4oLcBJxPb71uCpj1QDMSJEz8Bz2AoxlkAtv+52BjGsJnbpEhMj0uSohdCFlV0xKCisPH3swC39HX4A=) 2026-04-20 00:23:56.127718 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILgGQKBdKWhANtR1ROkq7QqWydur3uLI166GYXyqR1aF) 2026-04-20 00:23:56.127738 | orchestrator | 2026-04-20 00:23:56.127758 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-20 00:23:56.127778 | orchestrator | Monday 20 April 2026 00:23:54 +0000 (0:00:00.992) 0:00:11.097 ********** 2026-04-20 00:23:56.127797 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIPV4xOOW2ZMeiEi30Sy/RIhTSCQOj2R/vbAjwmTbmff) 2026-04-20 00:23:56.127817 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDaepzOfwGddJx1bj4LIxxsZA0Gd9YWxJ4M4oIFyooPvVMkf/0MnaZ8uwQ6Aqsn8gP6iyxPheTvLOfoe89IhbU857F28VnXGGnNeglPkQO131sjR1lgTwr5SoLRg57h9gkvlsmOUhBy5IOKqy2Dv0AerkWY/aYWOQqVRMJnWFrBsVNC/NlMnCRP3wyBHwIqy3tHAkIsMfH7dpoL8dbmCXckEb9CAkBohPkvd4KXd/FjHOw012TvxMpyqFUIZ5TXOoxHwnyZaioy7fdgRsT0MMXxm+RhwKHRTZzmDXA8CEALHX7wyimCj3oj7TpQruvtMNQMKwXrqET5taFjri3HtojDi7pLM4pIK+MD6mxOJ6n9zRDQO5J4qHboMq04NzWR0+8uUjNDKsl0z8YU9Hg6MupZT/SgXv0gNzfHRuAg7s5qlAx1ni0nA/QdKvOYR/kCAPjiqN9wtW3PU8hkiS6WH7+cuOJP0Cx7ZyZ/REVF1eE5vEuPDWs1p6oKmzAzs8oR2w8=) 2026-04-20 00:23:56.127837 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAyvItLvkE4KpqRFcdtALD093rxk41ioZXF4xNew8DicRXrixRa09OWCppwzHz24IdBt9VMzt0fKlfIrYbBBRF0=) 2026-04-20 00:23:56.127856 | orchestrator | 2026-04-20 00:23:56.127875 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-20 00:23:56.127895 | orchestrator | Monday 20 April 2026 00:23:55 +0000 (0:00:01.042) 0:00:12.140 ********** 2026-04-20 00:23:56.127930 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAeaQcoKTMATzIHXTcUVCniMkmP8VRb8E8j3ZCq7hYbg+nJOVCIHpPKtZmm7dgk3fk7xExChWHqqWK3K5rjOR2Hze4Jm8X3oHlmAWwO+yP7gp8ihVtD1nnhmaPAzmdCS9bb2Usg6TV3n34IlULCwlm5u6lBXOikrtdkAOeD3qM9nlI8iy3l+Gnl7thTVMan/LbQMxHmm+YS6tWH3el7oGfqbqWVf7F8DoRIJbyZezIbeijbJ+7j5curWx1kTkt/xN/ZxcDh/WjxUDxhoQAnJDDaanZfQ/QOK3+O9hEsjZti/3GfUVpmRzFc0pRjKsrD04wPoKsfDpaQjnh5T2pySuYo43WdTlZjPihhBxs5R+ccrEw0yFVKgKVLO1JIJ2A5ICh8ZwUuezf13+m0D2pfZk20YwbC+su3tr2cK1IOrHH9COyk62tpw3ofu8xnDR7T/7DTTmFnPmSRUOu3ot7QZ7Wgou3TPr6C7GO+uaB9dl/MpBvq96EKmMO4KIbC1j/59s=) 2026-04-20 00:24:07.263301 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBH4CD7ntprqzQ7j6x7Ed0UO7RTkrtNWfadgxEDB3V+gzP+0UScaGRkWpZK+qPJfLboLk4eLVksJulOfuXxrRW7A=) 
2026-04-20 00:24:07.263404 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIB2NYAt6hgU7ljFuN5RUzcmy7zHMqzVZaZbEh3OqifyJ) 2026-04-20 00:24:07.263416 | orchestrator | 2026-04-20 00:24:07.263425 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-20 00:24:07.263433 | orchestrator | Monday 20 April 2026 00:23:56 +0000 (0:00:00.994) 0:00:13.134 ********** 2026-04-20 00:24:07.263442 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDuCZIAMnA6UBGkhtUqHzjVjGiORlyrBAZ6wj0lUpQbLgnoi2RmpUBpvfDUhCZS3Ga6xnrxsvos69GFYJ2JFUpQbocGO+3Ap53FRHNtZAberBZSQj+/YQmnwAzFtPBx48gGEsx2hHYxns63oI7KVUEDAJwJgiZYQh6QqTnWNjNDu1B1qYWior+Ga7AUMYfPDxb+hJ4FCjJ+nsqHcpgHXmAp1Q7DFJKDyvxd+88WcN1ANQZDn2/fnbDCJe3n4xps2oErms+Mm8gEGzdBzFzojtQET2qtjEvTSRDKGYZEcRb4qiTFWbvZ9vP3rnwLcsnrovCqM6p/KiCGbIeBbjzmR8coztnjooLzCR5VfxzK3SSOK3EXm7eTF1VRj6fvYOZBc1VGaLzGE5lzh3m3RiVFKQByTlcZ3V6F/GwxwgjKxUcHOdd+clp2OPDI2Ey2PWqXJesKJNKJROHiUeFMNS7JXaPireyf6KZSJokkq/DYbY5JAi2Gkmoqbf8Ldcsugxjq1ZU=) 2026-04-20 00:24:07.263475 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOSExonRl3wQ0G11FJISbyKbL4tWHtBjmvN7dBdV1mxsNUFTzHBY0ZEb+c1b7G3ZUl4HNpNT34Xxs6ZWCM8FgjM=) 2026-04-20 00:24:07.263483 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPbrTYpfkNzT+BTRVqO6R5xwhmXHnmG5Y8iZF6Qn8xrN) 2026-04-20 00:24:07.263490 | orchestrator | 2026-04-20 00:24:07.263498 | orchestrator | TASK [osism.commons.known_hosts : Run ssh-keyscan for all hosts with ansible_host] *** 2026-04-20 00:24:07.263506 | orchestrator | Monday 20 April 2026 00:23:57 +0000 (0:00:00.995) 0:00:14.130 ********** 2026-04-20 00:24:07.263514 | orchestrator | ok: [testbed-manager] => (item=testbed-manager) 2026-04-20 00:24:07.263522 | orchestrator | ok: [testbed-manager] => 
(item=testbed-node-0) 2026-04-20 00:24:07.263529 | orchestrator | ok: [testbed-manager] => (item=testbed-node-1) 2026-04-20 00:24:07.263536 | orchestrator | ok: [testbed-manager] => (item=testbed-node-2) 2026-04-20 00:24:07.263543 | orchestrator | ok: [testbed-manager] => (item=testbed-node-3) 2026-04-20 00:24:07.263551 | orchestrator | ok: [testbed-manager] => (item=testbed-node-4) 2026-04-20 00:24:07.263558 | orchestrator | ok: [testbed-manager] => (item=testbed-node-5) 2026-04-20 00:24:07.263565 | orchestrator | 2026-04-20 00:24:07.263572 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with ansible_host] *** 2026-04-20 00:24:07.263581 | orchestrator | Monday 20 April 2026 00:24:02 +0000 (0:00:05.185) 0:00:19.316 ********** 2026-04-20 00:24:07.263589 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-manager) 2026-04-20 00:24:07.263599 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-0) 2026-04-20 00:24:07.263619 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-1) 2026-04-20 00:24:07.263627 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-2) 2026-04-20 00:24:07.263634 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-3) 2026-04-20 00:24:07.263641 | orchestrator | included: 
/usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-4) 2026-04-20 00:24:07.263648 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-5) 2026-04-20 00:24:07.263655 | orchestrator | 2026-04-20 00:24:07.263662 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-20 00:24:07.263669 | orchestrator | Monday 20 April 2026 00:24:03 +0000 (0:00:00.165) 0:00:19.482 ********** 2026-04-20 00:24:07.263676 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJ5qJxWoBXzjis/nwZfca9qjqqUPprNJK+tedHt0n993) 2026-04-20 00:24:07.263704 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCdtTgSTIECGzduy7TeEMuiPHf4Zyfcxji8JzbBo8t3MgyFDFJ1M1779It4G35S6IaJzKa0wYM1z3Fr681TiAbzEbKTsb4A1Dn0ojC+yH605XbyEGYLnlp6S/PKU1SDcvA9bdGJYFcTj+otwY6ZoQZftnqcWn0oSj1tnjBri1Fl/7FDmkXfrF5imtMs1zvl5g40+vYqXj1X8bqO4NpgjzOwo7AYcAsJ+LUlxcWYlA6wUFbh6Sj/Jh8EDrP8YSQKpnAoH9tKSP0SptnOtv9ebOeo8N0S5a/dbbAHsRWOWNUpzofd+ZMhTNXdIPhPersp3cGZyMtN84uRbAsb5o/RMgIW7JB8the3XySUzbtNjstJXmLqC6m6q06nCTsIeurB3lpiOFFzY6EEZzpv1EstBNoWHrHVhER9nhmz7dgnliFFnUDVytKE1lcs+YymaRtMo8tSzMQ1xzjozJIYsTQCZgxwsLDd0wfAzUEDTvKSIaLExALPXDZoBVMDy+ohPlFyC98=) 2026-04-20 00:24:07.263718 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBL0FPsNyE8ayEFl6Gld8LioSyF+nIurXoRSXRj2WBqDNTZ1wvGBLFE1rs3Q0ZsAtX2/1VHZSR7ZMz/zbOMG/4FA=) 2026-04-20 00:24:07.263725 | orchestrator | 2026-04-20 00:24:07.263732 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-20 00:24:07.263740 | orchestrator | Monday 20 April 2026 
00:24:04 +0000 (0:00:01.084) 0:00:20.566 ********** 2026-04-20 00:24:07.263747 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCzraFKYr/vWrn0GIzECtGAkI5RzOJ5JeEuDoVlETYlidxGYUk28u/DFPW+N+dLPG3Mk42BThGQ/TCqe42PfkdnbYdNh4gvcTzNNDzQnYTAHzUcIoLTAOiJx4ag2Xghmhbaq6w4GzuVRgWJa6VOnhURtYUtFLRpvhyRV2KJvxLRPWq6hUlkpxWcTC3QrysUVwUqdcvCrlKqZxgsi/BWp2e6E9OCSKMdPS07OqYxIvt01s0ZJ//vcSGDH9BzeY2WVYRwP4FM8Rql9f9/c28rYn22xPF1QTFTF4Lh3eJp9XWndhMljOvyKGLSUDA5yPa1IFvrG3IRYbt/GpobwBqIUL0wnLZE5/kLU1RYuuwGGjBsYmmdSghVw4Ofn6zakF/gfMVk1uHACLR1g9oTjhwLLtQiyNIARdNX6lxH6/SjcRM6Rm5jkRIm0D3YZhBUnwyhQCGBoys/1Tr+ghxgVWET8Zrd7ondPbxRiFGZrMBdpylCCnk0m22n/70gsv8BGjQopxs=) 2026-04-20 00:24:07.263755 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHm8zu6dNL34zbSSKoNLtNwggkJtrspWzPWUKOhBxjSsFUiKgwhJfeOUtDcBVSqfHVSAhDkMWzar8UkyfZLZEUQ=) 2026-04-20 00:24:07.263762 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPOjWMiYTHtdb7gu6/m85xTjEZjDkh7b14jvCb49Z6Es) 2026-04-20 00:24:07.263769 | orchestrator | 2026-04-20 00:24:07.263777 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-20 00:24:07.263784 | orchestrator | Monday 20 April 2026 00:24:05 +0000 (0:00:01.004) 0:00:21.571 ********** 2026-04-20 00:24:07.263791 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK1RWtM+gsGNomJYlHoKAsCTS2beF/IuXWk3GuMThm99AD5z3ChiaPNpdYnqbU+vhzxfxuxSQfBLX0HdIFArkDQ=) 2026-04-20 00:24:07.263798 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDDVVl/34xTMhHDZ9E+NRyTo2O0NisWIqFbnXwX4USaGrjcB5MnvDaECiD6u/0pG6+RJxHfr2E8HcInvpMDYWGFR3kt8O8QvZ2iPr9ZPMP+wObVEOjuUwvVOcWOmI5RanPfouYrx5DCLzfakedAi2dl6ELJHYU/ANAkoDKlubD5ye2GKVcF5F+24EcCchJ2Ee+qHy8HYe3tmqmSwwEF4aIpnskWgTTGQKDJHSkOhfsKYUClgbE0No5ix3XbB7ILEi2qbteVel55dzXo8iPUObkSwINOczWOYeTCB2VYbgflRY8S+H3UvCtDUlNn6okydhdygqrBNMQTrXFyfsv2Pzo9t4IYnldw0EDFMmI0vQ0rZWm3jHKh36Pgb21Ly6Q++i7yOV+BB13LJUmKVFifI+VgYb2pyMp0BLMOqe+7Eb8ryGXtENdR+Jf2cpzK+mojh9yS4iBfZ4kj4hF5UWIyalS+NsM9akFswwaLg/lTAwKt86DE3reN3goMpOXLf7D4oXc=) 2026-04-20 00:24:07.263806 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDUTqemCCeW72aN8ml8lZLhhRD31XXmCGNGgXNqRo/e9) 2026-04-20 00:24:07.263813 | orchestrator | 2026-04-20 00:24:07.263820 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-20 00:24:07.263828 | orchestrator | Monday 20 April 2026 00:24:06 +0000 (0:00:01.078) 0:00:22.649 ********** 2026-04-20 00:24:07.263836 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILgGQKBdKWhANtR1ROkq7QqWydur3uLI166GYXyqR1aF) 2026-04-20 00:24:07.263845 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCm2ujvuLDcGAU2suuR4Oh2VkLOnLCOqbKnMkRaB6bKW+L8hJVW3+skxtsQuo8gsNETGgP8M37Y5XuXxN9EA5rnsaX9njqWWnTmiyyKkfu6rHGtzoyRLO0MaklTvB4++uJqS9moYFmdS41ODnvvEoHFfFT2/P8ec5ubvVpj36B8UKG/O97X2tPjbN5Spl/xlyzMTnRe0q6djzB37nSu5d5rpChPMKRhc8mnpSELVazWuEOVPAugb+cSBlXTcxxhsLwH8KVqk/NJTrrV+lDRy9rkDJVho3djCo0rhcvJnSU/bw/j/UAbQov8RCgjk88IhkGyPAlvPMZ89wu0uE+cLTvmEQu69p2Gcc4SGwUzQirRXn17uM4MxXrfl9+RyDqwJ88zS6UoDmX85wl+WLQLAOL+Lalz3vZsgVUKFwn2XW9P/Ofqfovax3us1t0Dk96YDck7izQdzAWHDrkpU7WqXfxTX9KAnwBN4M5TMaj5ZPnInAghCklkVEna/jw+nnrVDOk=) 2026-04-20 00:24:07.263872 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ecdsa-sha2-nistp256 
AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLga5y18K4oLcBJxPb71uCpj1QDMSJEz8Bz2AoxlkAtv+52BjGsJnbpEhMj0uSohdCFlV0xKCisPH3swC39HX4A=) 2026-04-20 00:24:11.379637 | orchestrator | 2026-04-20 00:24:11.379751 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-20 00:24:11.379768 | orchestrator | Monday 20 April 2026 00:24:07 +0000 (0:00:01.033) 0:00:23.682 ********** 2026-04-20 00:24:11.379781 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAyvItLvkE4KpqRFcdtALD093rxk41ioZXF4xNew8DicRXrixRa09OWCppwzHz24IdBt9VMzt0fKlfIrYbBBRF0=) 2026-04-20 00:24:11.379816 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDaepzOfwGddJx1bj4LIxxsZA0Gd9YWxJ4M4oIFyooPvVMkf/0MnaZ8uwQ6Aqsn8gP6iyxPheTvLOfoe89IhbU857F28VnXGGnNeglPkQO131sjR1lgTwr5SoLRg57h9gkvlsmOUhBy5IOKqy2Dv0AerkWY/aYWOQqVRMJnWFrBsVNC/NlMnCRP3wyBHwIqy3tHAkIsMfH7dpoL8dbmCXckEb9CAkBohPkvd4KXd/FjHOw012TvxMpyqFUIZ5TXOoxHwnyZaioy7fdgRsT0MMXxm+RhwKHRTZzmDXA8CEALHX7wyimCj3oj7TpQruvtMNQMKwXrqET5taFjri3HtojDi7pLM4pIK+MD6mxOJ6n9zRDQO5J4qHboMq04NzWR0+8uUjNDKsl0z8YU9Hg6MupZT/SgXv0gNzfHRuAg7s5qlAx1ni0nA/QdKvOYR/kCAPjiqN9wtW3PU8hkiS6WH7+cuOJP0Cx7ZyZ/REVF1eE5vEuPDWs1p6oKmzAzs8oR2w8=) 2026-04-20 00:24:11.379832 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIPV4xOOW2ZMeiEi30Sy/RIhTSCQOj2R/vbAjwmTbmff) 2026-04-20 00:24:11.379844 | orchestrator | 2026-04-20 00:24:11.379855 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-20 00:24:11.379866 | orchestrator | Monday 20 April 2026 00:24:08 +0000 (0:00:01.078) 0:00:24.761 ********** 2026-04-20 00:24:11.379883 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDAeaQcoKTMATzIHXTcUVCniMkmP8VRb8E8j3ZCq7hYbg+nJOVCIHpPKtZmm7dgk3fk7xExChWHqqWK3K5rjOR2Hze4Jm8X3oHlmAWwO+yP7gp8ihVtD1nnhmaPAzmdCS9bb2Usg6TV3n34IlULCwlm5u6lBXOikrtdkAOeD3qM9nlI8iy3l+Gnl7thTVMan/LbQMxHmm+YS6tWH3el7oGfqbqWVf7F8DoRIJbyZezIbeijbJ+7j5curWx1kTkt/xN/ZxcDh/WjxUDxhoQAnJDDaanZfQ/QOK3+O9hEsjZti/3GfUVpmRzFc0pRjKsrD04wPoKsfDpaQjnh5T2pySuYo43WdTlZjPihhBxs5R+ccrEw0yFVKgKVLO1JIJ2A5ICh8ZwUuezf13+m0D2pfZk20YwbC+su3tr2cK1IOrHH9COyk62tpw3ofu8xnDR7T/7DTTmFnPmSRUOu3ot7QZ7Wgou3TPr6C7GO+uaB9dl/MpBvq96EKmMO4KIbC1j/59s=) 2026-04-20 00:24:11.379895 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIB2NYAt6hgU7ljFuN5RUzcmy7zHMqzVZaZbEh3OqifyJ) 2026-04-20 00:24:11.379906 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBH4CD7ntprqzQ7j6x7Ed0UO7RTkrtNWfadgxEDB3V+gzP+0UScaGRkWpZK+qPJfLboLk4eLVksJulOfuXxrRW7A=) 2026-04-20 00:24:11.379918 | orchestrator | 2026-04-20 00:24:11.379929 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2026-04-20 00:24:11.379940 | orchestrator | Monday 20 April 2026 00:24:09 +0000 (0:00:01.037) 0:00:25.798 ********** 2026-04-20 00:24:11.379951 | orchestrator | changed: [testbed-manager] => (item=192.168.16.15 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDuCZIAMnA6UBGkhtUqHzjVjGiORlyrBAZ6wj0lUpQbLgnoi2RmpUBpvfDUhCZS3Ga6xnrxsvos69GFYJ2JFUpQbocGO+3Ap53FRHNtZAberBZSQj+/YQmnwAzFtPBx48gGEsx2hHYxns63oI7KVUEDAJwJgiZYQh6QqTnWNjNDu1B1qYWior+Ga7AUMYfPDxb+hJ4FCjJ+nsqHcpgHXmAp1Q7DFJKDyvxd+88WcN1ANQZDn2/fnbDCJe3n4xps2oErms+Mm8gEGzdBzFzojtQET2qtjEvTSRDKGYZEcRb4qiTFWbvZ9vP3rnwLcsnrovCqM6p/KiCGbIeBbjzmR8coztnjooLzCR5VfxzK3SSOK3EXm7eTF1VRj6fvYOZBc1VGaLzGE5lzh3m3RiVFKQByTlcZ3V6F/GwxwgjKxUcHOdd+clp2OPDI2Ey2PWqXJesKJNKJROHiUeFMNS7JXaPireyf6KZSJokkq/DYbY5JAi2Gkmoqbf8Ldcsugxjq1ZU=) 2026-04-20 00:24:11.380071 | orchestrator | changed: [testbed-manager] => 
(item=192.168.16.15 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOSExonRl3wQ0G11FJISbyKbL4tWHtBjmvN7dBdV1mxsNUFTzHBY0ZEb+c1b7G3ZUl4HNpNT34Xxs6ZWCM8FgjM=) 2026-04-20 00:24:11.380086 | orchestrator | changed: [testbed-manager] => (item=192.168.16.15 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPbrTYpfkNzT+BTRVqO6R5xwhmXHnmG5Y8iZF6Qn8xrN) 2026-04-20 00:24:11.380097 | orchestrator | 2026-04-20 00:24:11.380109 | orchestrator | TASK [osism.commons.known_hosts : Write static known_hosts entries] ************ 2026-04-20 00:24:11.380120 | orchestrator | Monday 20 April 2026 00:24:10 +0000 (0:00:01.024) 0:00:26.822 ********** 2026-04-20 00:24:11.380132 | orchestrator | skipping: [testbed-manager] => (item=testbed-manager)  2026-04-20 00:24:11.380143 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-0)  2026-04-20 00:24:11.380154 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-1)  2026-04-20 00:24:11.380165 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-2)  2026-04-20 00:24:11.380176 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-3)  2026-04-20 00:24:11.380187 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-4)  2026-04-20 00:24:11.380198 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-5)  2026-04-20 00:24:11.380209 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:24:11.380220 | orchestrator | 2026-04-20 00:24:11.380249 | orchestrator | TASK [osism.commons.known_hosts : Write extra known_hosts entries] ************* 2026-04-20 00:24:11.380261 | orchestrator | Monday 20 April 2026 00:24:10 +0000 (0:00:00.179) 0:00:27.002 ********** 2026-04-20 00:24:11.380272 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:24:11.380283 | orchestrator | 2026-04-20 00:24:11.380294 | orchestrator | TASK [osism.commons.known_hosts : Delete known_hosts entries] ****************** 2026-04-20 00:24:11.380308 | orchestrator | Monday 20 April 2026 
00:24:10 +0000 (0:00:00.047) 0:00:27.049 ********** 2026-04-20 00:24:11.380327 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:24:11.380347 | orchestrator | 2026-04-20 00:24:11.380367 | orchestrator | TASK [osism.commons.known_hosts : Set file permissions] ************************ 2026-04-20 00:24:11.380386 | orchestrator | Monday 20 April 2026 00:24:10 +0000 (0:00:00.051) 0:00:27.101 ********** 2026-04-20 00:24:11.380405 | orchestrator | changed: [testbed-manager] 2026-04-20 00:24:11.380424 | orchestrator | 2026-04-20 00:24:11.380442 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:24:11.380462 | orchestrator | testbed-manager : ok=31  changed=15  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2026-04-20 00:24:11.380482 | orchestrator | 2026-04-20 00:24:11.380501 | orchestrator | 2026-04-20 00:24:11.380521 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 00:24:11.380540 | orchestrator | Monday 20 April 2026 00:24:11 +0000 (0:00:00.485) 0:00:27.587 ********** 2026-04-20 00:24:11.380560 | orchestrator | =============================================================================== 2026-04-20 00:24:11.380580 | orchestrator | osism.commons.known_hosts : Run ssh-keyscan for all hosts with hostname --- 6.42s 2026-04-20 00:24:11.380600 | orchestrator | osism.commons.known_hosts : Run ssh-keyscan for all hosts with ansible_host --- 5.19s 2026-04-20 00:24:11.380621 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.27s 2026-04-20 00:24:11.380642 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.08s 2026-04-20 00:24:11.380662 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.08s 2026-04-20 00:24:11.380677 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.08s 
2026-04-20 00:24:11.380695 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.04s 2026-04-20 00:24:11.380713 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.04s 2026-04-20 00:24:11.380731 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.03s 2026-04-20 00:24:11.380764 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.03s 2026-04-20 00:24:11.380783 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.02s 2026-04-20 00:24:11.380802 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.02s 2026-04-20 00:24:11.380821 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.00s 2026-04-20 00:24:11.380840 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.00s 2026-04-20 00:24:11.380858 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 0.99s 2026-04-20 00:24:11.380878 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 0.99s 2026-04-20 00:24:11.380897 | orchestrator | osism.commons.known_hosts : Set file permissions ------------------------ 0.49s 2026-04-20 00:24:11.380916 | orchestrator | osism.commons.known_hosts : Write static known_hosts entries ------------ 0.18s 2026-04-20 00:24:11.380934 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with hostname --- 0.17s 2026-04-20 00:24:11.380954 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with ansible_host --- 0.17s 2026-04-20 00:24:11.550456 | orchestrator | + osism apply squid 2026-04-20 00:24:22.829360 | orchestrator | 2026-04-20 00:24:22 | INFO  | Prepare task for execution of squid. 
2026-04-20 00:24:22.896583 | orchestrator | 2026-04-20 00:24:22 | INFO  | Task e4dc3860-fadf-44cb-bbe9-7878862d8564 (squid) was prepared for execution. 2026-04-20 00:24:22.896673 | orchestrator | 2026-04-20 00:24:22 | INFO  | It takes a moment until task e4dc3860-fadf-44cb-bbe9-7878862d8564 (squid) has been started and output is visible here. 2026-04-20 00:26:27.452315 | orchestrator | 2026-04-20 00:26:27.452375 | orchestrator | PLAY [Apply role squid] ******************************************************** 2026-04-20 00:26:27.452381 | orchestrator | 2026-04-20 00:26:27.452385 | orchestrator | TASK [osism.services.squid : Include install tasks] **************************** 2026-04-20 00:26:27.452390 | orchestrator | Monday 20 April 2026 00:24:25 +0000 (0:00:00.188) 0:00:00.188 ********** 2026-04-20 00:26:27.452394 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/squid/tasks/install-Debian-family.yml for testbed-manager 2026-04-20 00:26:27.452398 | orchestrator | 2026-04-20 00:26:27.452402 | orchestrator | TASK [osism.services.squid : Install required packages] ************************ 2026-04-20 00:26:27.452406 | orchestrator | Monday 20 April 2026 00:24:26 +0000 (0:00:00.077) 0:00:00.266 ********** 2026-04-20 00:26:27.452410 | orchestrator | ok: [testbed-manager] 2026-04-20 00:26:27.452414 | orchestrator | 2026-04-20 00:26:27.452418 | orchestrator | TASK [osism.services.squid : Create required directories] ********************** 2026-04-20 00:26:27.452422 | orchestrator | Monday 20 April 2026 00:24:28 +0000 (0:00:02.335) 0:00:02.602 ********** 2026-04-20 00:26:27.452426 | orchestrator | changed: [testbed-manager] => (item=/opt/squid/configuration) 2026-04-20 00:26:27.452439 | orchestrator | changed: [testbed-manager] => (item=/opt/squid/configuration/conf.d) 2026-04-20 00:26:27.452443 | orchestrator | ok: [testbed-manager] => (item=/opt/squid) 2026-04-20 00:26:27.452447 | orchestrator | 2026-04-20 00:26:27.452451 
| orchestrator | TASK [osism.services.squid : Copy squid configuration files] ******************* 2026-04-20 00:26:27.452454 | orchestrator | Monday 20 April 2026 00:24:29 +0000 (0:00:01.207) 0:00:03.809 ********** 2026-04-20 00:26:27.452458 | orchestrator | changed: [testbed-manager] => (item=osism.conf) 2026-04-20 00:26:27.452462 | orchestrator | 2026-04-20 00:26:27.452466 | orchestrator | TASK [osism.services.squid : Remove osism_allow_list.conf configuration file] *** 2026-04-20 00:26:27.452470 | orchestrator | Monday 20 April 2026 00:24:30 +0000 (0:00:01.026) 0:00:04.835 ********** 2026-04-20 00:26:27.452474 | orchestrator | ok: [testbed-manager] 2026-04-20 00:26:27.452478 | orchestrator | 2026-04-20 00:26:27.452481 | orchestrator | TASK [osism.services.squid : Copy docker-compose.yml file] ********************* 2026-04-20 00:26:27.452511 | orchestrator | Monday 20 April 2026 00:24:30 +0000 (0:00:00.339) 0:00:05.175 ********** 2026-04-20 00:26:27.452526 | orchestrator | changed: [testbed-manager] 2026-04-20 00:26:27.452532 | orchestrator | 2026-04-20 00:26:27.452536 | orchestrator | TASK [osism.services.squid : Manage squid service] ***************************** 2026-04-20 00:26:27.452540 | orchestrator | Monday 20 April 2026 00:24:31 +0000 (0:00:00.890) 0:00:06.065 ********** 2026-04-20 00:26:27.452543 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage squid service (10 retries left). 
2026-04-20 00:26:27.452547 | orchestrator |
ok: [testbed-manager]

RUNNING HANDLER [osism.services.squid : Restart squid service] *****************
Monday 20 April 2026 00:25:14 +0000 (0:00:42.506) 0:00:48.572 **********
changed: [testbed-manager]

RUNNING HANDLER [osism.services.squid : Wait for squid service to start] *******
Monday 20 April 2026 00:25:26 +0000 (0:00:12.065) 0:01:00.637 **********
Pausing for 60 seconds
changed: [testbed-manager]

RUNNING HANDLER [osism.services.squid : Register that squid service was restarted] ***
Monday 20 April 2026 00:26:26 +0000 (0:01:00.092) 0:02:00.729 **********
ok: [testbed-manager]

RUNNING HANDLER [osism.services.squid : Wait for an healthy squid service] *****
Monday 20 April 2026 00:26:26 +0000 (0:00:00.052) 0:02:00.781 **********
changed: [testbed-manager]

PLAY RECAP *********************************************************************
testbed-manager            : ok=11   changed=6    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0


TASKS RECAP ********************************************************************
Monday 20 April 2026 00:26:27 +0000 (0:00:00.636) 0:02:01.418 **********
===============================================================================
osism.services.squid : Wait for squid service to start ----------------- 60.09s
osism.services.squid : Manage squid service ---------------------------- 42.51s
osism.services.squid : Restart squid service --------------------------- 12.07s
osism.services.squid : Install required packages ------------------------ 2.34s
osism.services.squid : Create required directories ---------------------- 1.21s
osism.services.squid : Copy squid configuration files ------------------- 1.03s
osism.services.squid : Copy docker-compose.yml file --------------------- 0.89s
osism.services.squid : Wait for an healthy squid service ---------------- 0.64s
osism.services.squid : Remove osism_allow_list.conf configuration file --- 0.34s
osism.services.squid : Include install tasks ---------------------------- 0.08s
osism.services.squid : Register that squid service was restarted -------- 0.05s
+ [[ 10.0.0 != \l\a\t\e\s\t ]]
++ semver 10.0.0 10.0.0-0
+ [[ 1 -ge 0 ]]
+ /opt/configuration/scripts/set-kolla-namespace.sh kolla/release/2024.2
+ set -e
+ NAMESPACE=kolla/release/2024.2
+ sed -i 's#docker_namespace: .*#docker_namespace: kolla/release/2024.2#g' /opt/configuration/inventory/group_vars/all/kolla.yml
++ semver 10.0.0 9.0.0
+ [[ 1 -lt 0 ]]
+ osism apply operator -u ubuntu -l testbed-nodes
2026-04-20 00:26:39 | INFO  | Prepare task for execution of operator.
2026-04-20 00:26:39 | INFO  | Task 427a8c73-b0a7-413b-930e-a97b408a8d13 (operator) was prepared for execution.
2026-04-20 00:26:39 | INFO  | It takes a moment until task 427a8c73-b0a7-413b-930e-a97b408a8d13 (operator) has been started and output is visible here.
2026-04-20 00:26:56.340299 | orchestrator |

PLAY [Make ssh pipelining working] *********************************************

TASK [Gathering Facts] *********************************************************
Monday 20 April 2026 00:26:42 +0000 (0:00:00.210) 0:00:00.210 **********
ok: [testbed-node-2]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-3]

TASK [Do not require tty for all users] ****************************************
Monday 20 April 2026 00:26:47 +0000 (0:00:04.625) 0:00:04.835 **********
ok: [testbed-node-2]
ok: [testbed-node-1]
ok: [testbed-node-0]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-3]

PLAY [Apply role operator] *****************************************************

TASK [osism.commons.operator : Gather variables for each operating system] *****
Monday 20 April 2026 00:26:48 +0000 (0:00:00.882) 0:00:05.718 **********
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]

TASK [osism.commons.operator : Set operator_groups variable to default value] ***
Monday 20 April 2026 00:26:48 +0000 (0:00:00.187) 0:00:05.906 **********
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]

TASK [osism.commons.operator : Create operator group] **************************
Monday 20 April 2026 00:26:48 +0000 (0:00:00.218) 0:00:06.125 **********
changed: [testbed-node-0]
changed: [testbed-node-2]
changed: [testbed-node-1]
changed: [testbed-node-4]
changed: [testbed-node-5]
changed: [testbed-node-3]

TASK [osism.commons.operator : Create user] ************************************
Monday 20 April 2026 00:26:49 +0000 (0:00:00.805) 0:00:06.930 **********
changed: [testbed-node-1]
changed: [testbed-node-0]
changed: [testbed-node-5]
changed: [testbed-node-2]
changed: [testbed-node-4]
changed: [testbed-node-3]

TASK [osism.commons.operator : Add user to additional groups] ******************
Monday 20 April 2026 00:26:50 +0000 (0:00:00.833) 0:00:07.764 **********
changed: [testbed-node-0] => (item=adm)
changed: [testbed-node-1] => (item=adm)
changed: [testbed-node-3] => (item=adm)
changed: [testbed-node-4] => (item=adm)
changed: [testbed-node-2] => (item=adm)
changed: [testbed-node-5] => (item=adm)
changed: [testbed-node-0] => (item=sudo)
changed: [testbed-node-1] => (item=sudo)
changed: [testbed-node-2] => (item=sudo)
changed: [testbed-node-3] => (item=sudo)
changed: [testbed-node-4] => (item=sudo)
changed: [testbed-node-5] => (item=sudo)

TASK [osism.commons.operator : Copy user sudoers file] *************************
Monday 20 April 2026 00:26:51 +0000 (0:00:01.092) 0:00:08.857 **********
changed: [testbed-node-1]
changed: [testbed-node-4]
changed: [testbed-node-0]
changed: [testbed-node-3]
changed: [testbed-node-5]
changed: [testbed-node-2]

TASK [osism.commons.operator : Set language variables in .bashrc configuration file] ***
Monday 20 April 2026 00:26:52 +0000 (0:00:01.375) 0:00:10.232 **********
changed: [testbed-node-0] => (item=export LANGUAGE=C.UTF-8)
changed: [testbed-node-3] => (item=export LANGUAGE=C.UTF-8)
changed: [testbed-node-2] => (item=export LANGUAGE=C.UTF-8)
changed: [testbed-node-4] => (item=export LANGUAGE=C.UTF-8)
changed: [testbed-node-1] => (item=export LANGUAGE=C.UTF-8)
changed: [testbed-node-5] => (item=export LANGUAGE=C.UTF-8)
changed: [testbed-node-2] => (item=export LANG=C.UTF-8)
changed: [testbed-node-1] => (item=export LANG=C.UTF-8)
changed: [testbed-node-5] => (item=export LANG=C.UTF-8)
changed: [testbed-node-0] => (item=export LANG=C.UTF-8)
changed: [testbed-node-3] => (item=export LANG=C.UTF-8)
changed: [testbed-node-4] => (item=export LANG=C.UTF-8)
changed: [testbed-node-2] => (item=export LC_ALL=C.UTF-8)
[WARNING]: Module remote_tmp /root/.ansible/tmp did not exist and was created
with a mode of 0700, this may cause issues when running as another user. To
avoid this, create the remote_tmp dir with the correct permissions manually
changed: [testbed-node-1] => (item=export LC_ALL=C.UTF-8)
changed: [testbed-node-5] => (item=export LC_ALL=C.UTF-8)
changed: [testbed-node-4] => (item=export LC_ALL=C.UTF-8)
changed: [testbed-node-0] => (item=export LC_ALL=C.UTF-8)
changed: [testbed-node-3] => (item=export LC_ALL=C.UTF-8)

TASK [osism.commons.operator : Set custom environment variables in .bashrc configuration file] ***
Monday 20 April 2026 00:26:54 +0000 (0:00:01.245) 0:00:11.478 **********
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]

TASK [osism.commons.operator : Set custom PS1 prompt in .bashrc configuration file] ***
Monday 20 April 2026 00:26:54 +0000 (0:00:00.164) 0:00:11.643 **********
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]

TASK [osism.commons.operator : Create .ssh directory] **************************
Monday 20 April 2026 00:26:54 +0000 (0:00:00.201) 0:00:11.844 **********
changed: [testbed-node-5]
changed: [testbed-node-4]
changed: [testbed-node-3]
changed: [testbed-node-1]
changed: [testbed-node-2]
changed: [testbed-node-0]

TASK [osism.commons.operator : Check number of SSH authorized keys] ************
Monday 20 April 2026 00:26:55 +0000 (0:00:00.582) 0:00:12.427 **********
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]

TASK [osism.commons.operator : Set ssh authorized keys] ************************
Monday 20 April 2026 00:26:55 +0000 (0:00:00.210) 0:00:12.637 **********
changed: [testbed-node-0] => (item=None)
changed: [testbed-node-0]
changed: [testbed-node-5] => (item=None)
changed: [testbed-node-5]
changed: [testbed-node-2] => (item=None)
changed: [testbed-node-4] => (item=None)
changed: [testbed-node-4]
changed: [testbed-node-2]
changed: [testbed-node-1] => (item=None)
changed: [testbed-node-1]
changed: [testbed-node-3] => (item=None)
changed: [testbed-node-3]

TASK [osism.commons.operator : Delete ssh authorized keys] *********************
Monday 20 April 2026 00:26:56 +0000 (0:00:00.754) 0:00:13.391 **********
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]

TASK [osism.commons.operator : Set authorized GitHub accounts] *****************
Monday 20 April 2026 00:26:56 +0000 (0:00:00.145) 0:00:13.537 **********
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]

TASK [osism.commons.operator : Delete authorized GitHub accounts] **************
Monday 20 April 2026 00:26:56 +0000 (0:00:00.136) 0:00:13.679 **********
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]

TASK [osism.commons.operator : Set password] ***********************************
Monday 20 April 2026 00:26:56 +0000 (0:00:00.136) 0:00:13.816 **********
changed: [testbed-node-0]
changed: [testbed-node-1]
changed: [testbed-node-2]
changed: [testbed-node-4]
changed: [testbed-node-5]
changed: [testbed-node-3]

TASK [osism.commons.operator : Unset & lock password] **************************
Monday 20 April 2026 00:26:57 +0000 (0:00:00.736) 0:00:14.552 **********
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]

PLAY RECAP *********************************************************************
testbed-node-0             : ok=12   changed=8    unreachable=0    failed=0    skipped=7    rescued=0    ignored=0
testbed-node-1             : ok=12   changed=8    unreachable=0    failed=0    skipped=7    rescued=0    ignored=0
testbed-node-2             : ok=12   changed=8    unreachable=0    failed=0    skipped=7    rescued=0    ignored=0
testbed-node-3             : ok=12   changed=8    unreachable=0    failed=0    skipped=7    rescued=0    ignored=0
testbed-node-4             : ok=12   changed=8    unreachable=0    failed=0    skipped=7    rescued=0    ignored=0
testbed-node-5             : ok=12   changed=8    unreachable=0    failed=0    skipped=7    rescued=0    ignored=0


TASKS RECAP ********************************************************************
Monday 20 April 2026 00:26:57 +0000 (0:00:00.206) 0:00:14.759 **********
===============================================================================
Gathering Facts --------------------------------------------------------- 4.63s
osism.commons.operator : Copy user sudoers file ------------------------- 1.38s
osism.commons.operator : Set language variables in .bashrc configuration file --- 1.25s
osism.commons.operator : Add user to additional groups ------------------ 1.09s
Do not require tty for all users ---------------------------------------- 0.88s
osism.commons.operator : Create user ------------------------------------ 0.83s
osism.commons.operator : Create operator group -------------------------- 0.81s
osism.commons.operator : Set ssh authorized keys ------------------------ 0.75s
osism.commons.operator : Set password ----------------------------------- 0.74s
osism.commons.operator : Create .ssh directory -------------------------- 0.58s
osism.commons.operator : Set operator_groups variable to default value --- 0.22s
osism.commons.operator : Check number of SSH authorized keys ------------ 0.21s
osism.commons.operator : Unset & lock password -------------------------- 0.21s
osism.commons.operator : Set custom PS1 prompt in .bashrc configuration file --- 0.20s
osism.commons.operator : Gather variables for each operating system ----- 0.19s
osism.commons.operator : Set custom environment variables in .bashrc configuration file --- 0.16s
osism.commons.operator : Delete ssh authorized keys --------------------- 0.15s
osism.commons.operator : Set authorized GitHub accounts ----------------- 0.14s
osism.commons.operator : Delete authorized GitHub accounts -------------- 0.14s
+ osism apply --environment custom facts
2026-04-20 00:26:59 | INFO  | Trying to run play facts in environment custom
2026-04-20 00:27:09 | INFO  | Prepare task for execution of facts.
2026-04-20 00:27:09 | INFO  | Task e6310c94-0992-4424-953c-2ba32e720dd6 (facts) was prepared for execution.
2026-04-20 00:27:09 | INFO  | It takes a moment until task e6310c94-0992-4424-953c-2ba32e720dd6 (facts) has been started and output is visible here.
2026-04-20 00:27:54.896138 | orchestrator |

PLAY [Copy custom network devices fact] ****************************************

TASK [Create custom facts directory] *******************************************
Monday 20 April 2026 00:27:12 +0000 (0:00:00.118) 0:00:00.118 **********
changed: [testbed-node-1]
changed: [testbed-node-0]
ok: [testbed-manager]
changed: [testbed-node-2]
changed: [testbed-node-4]
changed: [testbed-node-5]
changed: [testbed-node-3]

TASK [Copy fact file] **********************************************************
Monday 20 April 2026 00:27:13 +0000 (0:00:01.506) 0:00:01.625 **********
ok: [testbed-manager]
changed: [testbed-node-2]
changed: [testbed-node-5]
changed: [testbed-node-1]
changed: [testbed-node-4]
changed: [testbed-node-3]
changed: [testbed-node-0]

PLAY [Copy custom ceph devices facts] ******************************************

TASK [osism.commons.repository : Gather variables for each operating system] ***
Monday 20 April 2026 00:27:15 +0000 (0:00:01.360) 0:00:02.986 **********
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]

TASK [osism.commons.repository : Set repository_default fact to default value] ***
Monday 20 April 2026 00:27:15 +0000 (0:00:00.102) 0:00:03.088 **********
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]

TASK [osism.commons.repository : Set repositories to default] ******************
Monday 20 April 2026 00:27:15 +0000 (0:00:00.204) 0:00:03.294 **********
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]

TASK [osism.commons.repository : Include distribution specific repository tasks] ***
Monday 20 April 2026 00:27:15 +0000 (0:00:00.201) 0:00:03.495 **********
included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-node-3, testbed-node-4, testbed-node-5

TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] *****
Monday 20 April 2026 00:27:15 +0000 (0:00:00.125) 0:00:03.621 **********
ok: [testbed-node-3]
ok: [testbed-node-5]
ok: [testbed-node-4]

TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] *************
Monday 20 April 2026 00:27:16 +0000 (0:00:00.491) 0:00:04.113 **********
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]

TASK [osism.commons.repository : Copy 99osism apt configuration] ***************
Monday 20 April 2026 00:27:16 +0000 (0:00:00.109) 0:00:04.222 **********
changed: [testbed-node-3]
changed: [testbed-node-4]
changed: [testbed-node-5]

TASK [osism.commons.repository : Remove sources.list file] *********************
Monday 20 April 2026 00:27:17 +0000 (0:00:01.033) 0:00:05.256 **********
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]

TASK [osism.commons.repository : Copy ubuntu.sources file] *********************
Monday 20 April 2026 00:27:17 +0000 (0:00:00.440) 0:00:05.697 **********
changed: [testbed-node-3]
changed: [testbed-node-4]
changed: [testbed-node-5]

TASK [osism.commons.repository : Update package cache] *************************
Monday 20 April 2026 00:27:18 +0000 (0:00:00.844) 0:00:06.541 **********
changed: [testbed-node-5]
changed: [testbed-node-4]
changed: [testbed-node-3]

TASK [Install required packages (RedHat)] **************************************
Monday 20 April 2026 00:27:36 +0000 (0:00:17.450) 0:00:23.992 **********
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]

TASK [Install required packages (Debian)] **************************************
Monday 20 April 2026 00:27:36 +0000 (0:00:00.088) 0:00:24.080 **********
changed: [testbed-node-4]
changed: [testbed-node-5]
changed: [testbed-node-3]

TASK [Create custom facts directory] *******************************************
Monday 20 April 2026 00:27:45 +0000 (0:00:09.403) 0:00:33.484 **********
ok: [testbed-node-4]
ok: [testbed-node-3]
ok: [testbed-node-5]

TASK [Copy fact files] *********************************************************
Monday 20 April 2026 00:27:46 +0000 (0:00:00.445) 0:00:33.929 **********
changed: [testbed-node-4] => (item=testbed_ceph_devices)
changed: [testbed-node-5] => (item=testbed_ceph_devices)
changed: [testbed-node-3] => (item=testbed_ceph_devices)
changed: [testbed-node-5] => (item=testbed_ceph_devices_all)
changed: [testbed-node-4] => (item=testbed_ceph_devices_all)
changed: [testbed-node-3] => (item=testbed_ceph_devices_all)
changed: [testbed-node-4] => (item=testbed_ceph_osd_devices)
changed: [testbed-node-5] => (item=testbed_ceph_osd_devices)
changed: [testbed-node-3] => (item=testbed_ceph_osd_devices)
changed: [testbed-node-3] => (item=testbed_ceph_osd_devices_all)
changed: [testbed-node-5] => (item=testbed_ceph_osd_devices_all)
changed: [testbed-node-4] => (item=testbed_ceph_osd_devices_all)

RUNNING HANDLER [osism.commons.repository : Force update of package cache] *****
Monday 20 April 2026 00:27:49 +0000 (0:00:03.733) 0:00:37.662 **********
ok: [testbed-node-5]
ok: [testbed-node-4]
ok: [testbed-node-3]

PLAY [Gather facts for all hosts] **********************************************

TASK [Gathers facts about hosts] ***********************************************
Monday 20 April 2026 00:27:51 +0000 (0:00:01.401) 0:00:39.064 **********
ok: [testbed-node-1]
ok: [testbed-node-0]
ok: [testbed-node-2]
ok: [testbed-manager]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-3]

PLAY RECAP *********************************************************************
testbed-manager            : ok=3    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0
testbed-node-0             : ok=3    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0
testbed-node-1             : ok=3    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0
testbed-node-2             : ok=3    changed=2    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0
testbed-node-3             : ok=16   changed=7    unreachable=0    failed=0    skipped=2    rescued=0    ignored=0
testbed-node-4             : ok=16   changed=7    unreachable=0    failed=0    skipped=2    rescued=0    ignored=0
testbed-node-5             : ok=16   changed=7    unreachable=0    failed=0    skipped=2    rescued=0    ignored=0


TASKS RECAP ********************************************************************
Monday 20 April 2026 00:27:54 +0000 (0:00:03.632) 0:00:42.697 **********
===============================================================================
osism.commons.repository : Update package cache ------------------------ 17.45s
Install required packages (Debian) -------------------------------------- 9.40s
Copy fact files --------------------------------------------------------- 3.73s
Gathers facts about hosts ----------------------------------------------- 3.63s
Create custom facts directory ------------------------------------------- 1.51s
osism.commons.repository : Force update of package cache ---------------- 1.40s
Copy fact file ---------------------------------------------------------- 1.36s
osism.commons.repository : Copy 99osism apt configuration --------------- 1.03s
osism.commons.repository : Copy ubuntu.sources file --------------------- 0.84s
osism.commons.repository : Create 
/etc/apt/sources.list.d directory ----- 0.49s 2026-04-20 00:27:55.057285 | orchestrator | Create custom facts directory ------------------------------------------- 0.45s 2026-04-20 00:27:55.057290 | orchestrator | osism.commons.repository : Remove sources.list file --------------------- 0.44s 2026-04-20 00:27:55.057295 | orchestrator | osism.commons.repository : Set repository_default fact to default value --- 0.21s 2026-04-20 00:27:55.057300 | orchestrator | osism.commons.repository : Set repositories to default ------------------ 0.20s 2026-04-20 00:27:55.057305 | orchestrator | osism.commons.repository : Include distribution specific repository tasks --- 0.13s 2026-04-20 00:27:55.057310 | orchestrator | osism.commons.repository : Include tasks for Ubuntu < 24.04 ------------- 0.11s 2026-04-20 00:27:55.057315 | orchestrator | osism.commons.repository : Gather variables for each operating system --- 0.10s 2026-04-20 00:27:55.057320 | orchestrator | Install required packages (RedHat) -------------------------------------- 0.09s 2026-04-20 00:27:55.223777 | orchestrator | + osism apply bootstrap 2026-04-20 00:28:06.554105 | orchestrator | 2026-04-20 00:28:06 | INFO  | Prepare task for execution of bootstrap. 2026-04-20 00:28:06.640819 | orchestrator | 2026-04-20 00:28:06 | INFO  | Task 6cfcc6d4-b4ed-45d1-a737-0edcd32f6909 (bootstrap) was prepared for execution. 2026-04-20 00:28:06.640937 | orchestrator | 2026-04-20 00:28:06 | INFO  | It takes a moment until task 6cfcc6d4-b4ed-45d1-a737-0edcd32f6909 (bootstrap) has been started and output is visible here. 
2026-04-20 00:28:21.701265 | orchestrator |
2026-04-20 00:28:21.701378 | orchestrator | PLAY [Group hosts based on state bootstrap] ************************************
2026-04-20 00:28:21.701395 | orchestrator |
2026-04-20 00:28:21.701407 | orchestrator | TASK [Group hosts based on state bootstrap] ************************************
2026-04-20 00:28:21.701419 | orchestrator | Monday 20 April 2026 00:28:09 +0000 (0:00:00.141) 0:00:00.141 **********
2026-04-20 00:28:21.701431 | orchestrator | ok: [testbed-manager]
2026-04-20 00:28:21.701443 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:28:21.701454 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:28:21.701465 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:28:21.701476 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:28:21.701486 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:28:21.701497 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:28:21.701507 | orchestrator |
2026-04-20 00:28:21.701518 | orchestrator | PLAY [Gather facts for all hosts] **********************************************
2026-04-20 00:28:21.701529 | orchestrator |
2026-04-20 00:28:21.701540 | orchestrator | TASK [Gathers facts about hosts] ***********************************************
2026-04-20 00:28:21.701550 | orchestrator | Monday 20 April 2026 00:28:09 +0000 (0:00:00.213) 0:00:00.355 **********
2026-04-20 00:28:21.701561 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:28:21.701571 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:28:21.701582 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:28:21.701592 | orchestrator | ok: [testbed-manager]
2026-04-20 00:28:21.701603 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:28:21.701635 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:28:21.701646 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:28:21.701657 | orchestrator |
2026-04-20 00:28:21.701667 | orchestrator | PLAY [Gather facts for all hosts (if using --limit)] ***************************
2026-04-20 00:28:21.701700 | orchestrator |
2026-04-20 00:28:21.701712 | orchestrator | TASK [Gathers facts about hosts] ***********************************************
2026-04-20 00:28:21.701722 | orchestrator | Monday 20 April 2026 00:28:14 +0000 (0:00:04.904) 0:00:05.260 **********
2026-04-20 00:28:21.701735 | orchestrator | skipping: [testbed-manager] => (item=testbed-manager)
2026-04-20 00:28:21.701747 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-0)
2026-04-20 00:28:21.701760 | orchestrator | skipping: [testbed-node-0] => (item=testbed-manager)
2026-04-20 00:28:21.701772 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-1)
2026-04-20 00:28:21.701785 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)
2026-04-20 00:28:21.701797 | orchestrator | skipping: [testbed-node-1] => (item=testbed-manager)
2026-04-20 00:28:21.701809 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-2)
2026-04-20 00:28:21.701822 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-3)
2026-04-20 00:28:21.701835 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)
2026-04-20 00:28:21.701847 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-0)
2026-04-20 00:28:21.701859 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-4)
2026-04-20 00:28:21.701871 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-1)
2026-04-20 00:28:21.701883 | orchestrator | skipping: [testbed-node-2] => (item=testbed-manager)
2026-04-20 00:28:21.701896 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-5)
2026-04-20 00:28:21.701907 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)
2026-04-20 00:28:21.701919 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-2)
2026-04-20 00:28:21.701931 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-0)
2026-04-20 00:28:21.701944 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:28:21.701955 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-3)
2026-04-20 00:28:21.701968 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-3)
2026-04-20 00:28:21.701980 | orchestrator | skipping: [testbed-node-3] => (item=testbed-manager)
2026-04-20 00:28:21.702106 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-1)
2026-04-20 00:28:21.702119 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-4)
2026-04-20 00:28:21.702129 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-2)
2026-04-20 00:28:21.702139 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-4)
2026-04-20 00:28:21.702150 | orchestrator | skipping: [testbed-node-4] => (item=testbed-manager)
2026-04-20 00:28:21.702161 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)
2026-04-20 00:28:21.702171 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-5)
2026-04-20 00:28:21.702182 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:28:21.702192 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-3)
2026-04-20 00:28:21.702203 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-0)
2026-04-20 00:28:21.702213 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)
2026-04-20 00:28:21.702223 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-4)
2026-04-20 00:28:21.702234 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-5)
2026-04-20 00:28:21.702244 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:28:21.702255 | orchestrator | skipping: [testbed-node-5] => (item=testbed-manager)
2026-04-20 00:28:21.702265 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-1)
2026-04-20 00:28:21.702276 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-5)
2026-04-20 00:28:21.702286 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:28:21.702303 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)
2026-04-20 00:28:21.702314 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-0)
2026-04-20 00:28:21.702324 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-2)
2026-04-20 00:28:21.702344 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2026-04-20 00:28:21.702354 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-1)
2026-04-20 00:28:21.702365 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-3)
2026-04-20 00:28:21.702376 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2026-04-20 00:28:21.702406 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-2)
2026-04-20 00:28:21.702418 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-4)
2026-04-20 00:28:21.702428 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2026-04-20 00:28:21.702439 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:28:21.702450 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-3)
2026-04-20 00:28:21.702461 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-5)
2026-04-20 00:28:21.702471 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:28:21.702482 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-4)
2026-04-20 00:28:21.702493 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-5)
2026-04-20 00:28:21.702503 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:28:21.702514 | orchestrator |
2026-04-20 00:28:21.702525 | orchestrator | PLAY [Apply bootstrap roles part 1] ********************************************
2026-04-20 00:28:21.702536 | orchestrator |
2026-04-20 00:28:21.702547 | orchestrator | TASK [osism.commons.hostname : Set hostname] ***********************************
2026-04-20 00:28:21.702558 | orchestrator | Monday 20 April 2026 00:28:15 +0000 (0:00:00.400) 0:00:05.660 **********
2026-04-20 00:28:21.702569 | orchestrator | ok: [testbed-manager]
2026-04-20 00:28:21.702580 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:28:21.702590 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:28:21.702601 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:28:21.702612 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:28:21.702622 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:28:21.702633 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:28:21.702643 | orchestrator |
2026-04-20 00:28:21.702654 | orchestrator | TASK [osism.commons.hostname : Copy /etc/hostname] *****************************
2026-04-20 00:28:21.702665 | orchestrator | Monday 20 April 2026 00:28:16 +0000 (0:00:01.179) 0:00:06.839 **********
2026-04-20 00:28:21.702676 | orchestrator | ok: [testbed-manager]
2026-04-20 00:28:21.702686 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:28:21.702697 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:28:21.702707 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:28:21.702718 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:28:21.702728 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:28:21.702739 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:28:21.702749 | orchestrator |
2026-04-20 00:28:21.702760 | orchestrator | TASK [osism.commons.hosts : Include type specific tasks] ***********************
2026-04-20 00:28:21.702771 | orchestrator | Monday 20 April 2026 00:28:17 +0000 (0:00:00.265) 0:00:07.967 **********
2026-04-20 00:28:21.702783 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/hosts/tasks/type-template.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:28:21.702796 | orchestrator |
2026-04-20 00:28:21.702807 | orchestrator | TASK [osism.commons.hosts : Copy /etc/hosts file] ******************************
2026-04-20 00:28:21.702818 | orchestrator | Monday 20 April 2026 00:28:17 +0000 (0:00:00.265) 0:00:08.233 **********
2026-04-20 00:28:21.702829 | orchestrator | changed: [testbed-manager]
2026-04-20 00:28:21.702840 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:28:21.702850 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:28:21.702861 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:28:21.702872 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:28:21.702882 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:28:21.702893 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:28:21.702904 | orchestrator |
2026-04-20 00:28:21.702915 | orchestrator | TASK [osism.commons.proxy : Include distribution specific tasks] ***************
2026-04-20 00:28:21.702932 | orchestrator | Monday 20 April 2026 00:28:19 +0000 (0:00:01.436) 0:00:09.669 **********
2026-04-20 00:28:21.702943 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:28:21.702955 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/proxy/tasks/Debian-family.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:28:21.702968 | orchestrator |
2026-04-20 00:28:21.702978 | orchestrator | TASK [osism.commons.proxy : Configure proxy parameters for apt] ****************
2026-04-20 00:28:21.703238 | orchestrator | Monday 20 April 2026 00:28:19 +0000 (0:00:00.320) 0:00:09.989 **********
2026-04-20 00:28:21.703254 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:28:21.703265 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:28:21.703276 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:28:21.703286 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:28:21.703297 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:28:21.703307 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:28:21.703317 | orchestrator |
2026-04-20 00:28:21.703328 | orchestrator | TASK [osism.commons.proxy : Set system wide settings in environment file] ******
2026-04-20 00:28:21.703339 | orchestrator | Monday 20 April 2026 00:28:20 +0000 (0:00:01.011) 0:00:11.001 **********
2026-04-20 00:28:21.703349 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:28:21.703360 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:28:21.703370 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:28:21.703381 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:28:21.703391 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:28:21.703402 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:28:21.703412 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:28:21.703423 | orchestrator |
2026-04-20 00:28:21.703433 | orchestrator | TASK [osism.commons.proxy : Remove system wide settings in environment file] ***
2026-04-20 00:28:21.703444 | orchestrator | Monday 20 April 2026 00:28:21 +0000 (0:00:00.571) 0:00:11.573 **********
2026-04-20 00:28:21.703454 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:28:21.703473 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:28:21.703484 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:28:21.703494 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:28:21.703504 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:28:21.703515 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:28:21.703526 | orchestrator | ok: [testbed-manager]
2026-04-20 00:28:21.703536 | orchestrator |
2026-04-20 00:28:21.703547 | orchestrator | TASK [osism.commons.resolvconf : Check minimum and maximum number of name servers] ***
2026-04-20 00:28:21.703559 | orchestrator | Monday 20 April 2026 00:28:21 +0000 (0:00:00.212) 0:00:12.019 **********
2026-04-20 00:28:21.703569 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:28:21.703580 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:28:21.703602 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:28:33.580070 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:28:33.580170 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:28:33.580186 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:28:33.580192 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:28:33.580198 | orchestrator |
2026-04-20 00:28:33.580206 | orchestrator | TASK [osism.commons.resolvconf : Include resolvconf tasks] *********************
2026-04-20 00:28:33.580215 | orchestrator | Monday 20 April 2026 00:28:21 +0000 (0:00:00.212) 0:00:12.232 **********
2026-04-20 00:28:33.580223 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-resolv.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:28:33.580243 | orchestrator |
2026-04-20 00:28:33.580249 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific installation tasks] ***
2026-04-20 00:28:33.580254 | orchestrator | Monday 20 April 2026 00:28:22 +0000 (0:00:00.298) 0:00:12.531 **********
2026-04-20 00:28:33.580262 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:28:33.580283 | orchestrator |
2026-04-20 00:28:33.580288 | orchestrator | TASK [osism.commons.resolvconf : Remove packages configuring /etc/resolv.conf] ***
2026-04-20 00:28:33.580292 | orchestrator | Monday 20 April 2026 00:28:22 +0000 (0:00:00.291) 0:00:12.823 **********
2026-04-20 00:28:33.580296 | orchestrator | ok: [testbed-manager]
2026-04-20 00:28:33.580301 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:28:33.580305 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:28:33.580308 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:28:33.580312 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:28:33.580316 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:28:33.580320 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:28:33.580325 | orchestrator |
2026-04-20 00:28:33.580331 | orchestrator | TASK [osism.commons.resolvconf : Install package systemd-resolved] *************
2026-04-20 00:28:33.580337 | orchestrator | Monday 20 April 2026 00:28:23 +0000 (0:00:01.566) 0:00:14.389 **********
2026-04-20 00:28:33.580346 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:28:33.580353 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:28:33.580362 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:28:33.580368 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:28:33.580373 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:28:33.580379 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:28:33.580385 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:28:33.580391 | orchestrator |
2026-04-20 00:28:33.580408 | orchestrator | TASK [osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf] *****
2026-04-20 00:28:33.580420 | orchestrator | Monday 20 April 2026 00:28:24 +0000 (0:00:00.242) 0:00:14.632 **********
2026-04-20 00:28:33.580425 | orchestrator | ok: [testbed-manager]
2026-04-20 00:28:33.580431 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:28:33.580437 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:28:33.580442 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:28:33.580448 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:28:33.580454 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:28:33.580459 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:28:33.580465 | orchestrator |
2026-04-20 00:28:33.580470 | orchestrator | TASK [osism.commons.resolvconf : Archive existing file /etc/resolv.conf] *******
2026-04-20 00:28:33.580477 | orchestrator | Monday 20 April 2026 00:28:24 +0000 (0:00:00.546) 0:00:15.179 **********
2026-04-20 00:28:33.580482 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:28:33.580487 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:28:33.580492 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:28:33.580498 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:28:33.580504 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:28:33.580510 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:28:33.580515 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:28:33.580521 | orchestrator |
2026-04-20 00:28:33.580527 | orchestrator | TASK [osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf] ***
2026-04-20 00:28:33.580536 | orchestrator | Monday 20 April 2026 00:28:25 +0000 (0:00:00.259) 0:00:15.438 **********
2026-04-20 00:28:33.580542 | orchestrator | ok: [testbed-manager]
2026-04-20 00:28:33.580549 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:28:33.580556 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:28:33.580562 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:28:33.580566 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:28:33.580571 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:28:33.580575 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:28:33.580579 | orchestrator |
2026-04-20 00:28:33.580584 | orchestrator | TASK [osism.commons.resolvconf : Copy configuration files] *********************
2026-04-20 00:28:33.580588 | orchestrator | Monday 20 April 2026 00:28:25 +0000 (0:00:00.585) 0:00:16.023 **********
2026-04-20 00:28:33.580593 | orchestrator | ok: [testbed-manager]
2026-04-20 00:28:33.580603 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:28:33.580608 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:28:33.580612 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:28:33.580616 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:28:33.580627 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:28:33.580632 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:28:33.580642 | orchestrator |
2026-04-20 00:28:33.580645 | orchestrator | TASK [osism.commons.resolvconf : Start/enable systemd-resolved service] ********
2026-04-20 00:28:33.580650 | orchestrator | Monday 20 April 2026 00:28:26 +0000 (0:00:01.216) 0:00:17.239 **********
2026-04-20 00:28:33.580654 | orchestrator | ok: [testbed-manager]
2026-04-20 00:28:33.580657 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:28:33.580661 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:28:33.580665 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:28:33.580669 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:28:33.580673 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:28:33.580677 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:28:33.580680 | orchestrator |
2026-04-20 00:28:33.580684 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific configuration tasks] ***
2026-04-20 00:28:33.580688 | orchestrator | Monday 20 April 2026 00:28:27 +0000 (0:00:01.128) 0:00:18.368 **********
2026-04-20 00:28:33.580717 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:28:33.580722 | orchestrator |
2026-04-20 00:28:33.580725 | orchestrator | TASK [osism.commons.resolvconf : Restart systemd-resolved service] *************
2026-04-20 00:28:33.580729 | orchestrator | Monday 20 April 2026 00:28:28 +0000 (0:00:00.257) 0:00:18.626 **********
2026-04-20 00:28:33.580733 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:28:33.580738 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:28:33.580744 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:28:33.580751 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:28:33.580757 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:28:33.580763 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:28:33.580770 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:28:33.580776 | orchestrator |
2026-04-20 00:28:33.580783 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] ***
2026-04-20 00:28:33.580789 | orchestrator | Monday 20 April 2026 00:28:29 +0000 (0:00:01.169) 0:00:19.795 **********
2026-04-20 00:28:33.580796 | orchestrator | ok: [testbed-manager]
2026-04-20 00:28:33.580802 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:28:33.580809 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:28:33.580815 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:28:33.580822 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:28:33.580826 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:28:33.580829 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:28:33.580833 | orchestrator |
2026-04-20 00:28:33.580837 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] ***
2026-04-20 00:28:33.580841 | orchestrator | Monday 20 April 2026 00:28:29 +0000 (0:00:00.196) 0:00:19.992 **********
2026-04-20 00:28:33.580845 | orchestrator | ok: [testbed-manager]
2026-04-20 00:28:33.580849 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:28:33.580853 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:28:33.580856 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:28:33.580860 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:28:33.580864 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:28:33.580868 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:28:33.580871 | orchestrator |
2026-04-20 00:28:33.580875 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ******************
2026-04-20 00:28:33.580880 | orchestrator | Monday 20 April 2026 00:28:29 +0000 (0:00:00.179) 0:00:20.171 **********
2026-04-20 00:28:33.580883 | orchestrator | ok: [testbed-manager]
2026-04-20 00:28:33.580887 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:28:33.580891 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:28:33.580901 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:28:33.580907 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:28:33.580912 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:28:33.580918 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:28:33.580924 | orchestrator |
2026-04-20 00:28:33.580931 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] ***
2026-04-20 00:28:33.580938 | orchestrator | Monday 20 April 2026 00:28:29 +0000 (0:00:00.175) 0:00:20.347 **********
2026-04-20 00:28:33.580944 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:28:33.580949 | orchestrator |
2026-04-20 00:28:33.580953 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] *****
2026-04-20 00:28:33.580957 | orchestrator | Monday 20 April 2026 00:28:30 +0000 (0:00:00.254) 0:00:20.602 **********
2026-04-20 00:28:33.580961 | orchestrator | ok: [testbed-manager]
2026-04-20 00:28:33.580964 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:28:33.580968 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:28:33.580989 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:28:33.580995 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:28:33.581002 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:28:33.581009 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:28:33.581015 | orchestrator |
2026-04-20 00:28:33.581022 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] *************
2026-04-20 00:28:33.581028 | orchestrator | Monday 20 April 2026 00:28:30 +0000 (0:00:00.535) 0:00:21.137 **********
2026-04-20 00:28:33.581035 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:28:33.581041 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:28:33.581047 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:28:33.581053 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:28:33.581059 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:28:33.581066 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:28:33.581071 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:28:33.581077 | orchestrator |
2026-04-20 00:28:33.581084 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] ***************
2026-04-20 00:28:33.581090 | orchestrator | Monday 20 April 2026 00:28:30 +0000 (0:00:00.219) 0:00:21.357 **********
2026-04-20 00:28:33.581095 | orchestrator | ok: [testbed-manager]
2026-04-20 00:28:33.581106 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:28:33.581113 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:28:33.581119 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:28:33.581125 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:28:33.581130 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:28:33.581136 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:28:33.581142 | orchestrator |
2026-04-20 00:28:33.581147 | orchestrator | TASK [osism.commons.repository : Remove sources.list file] *********************
2026-04-20 00:28:33.581154 | orchestrator | Monday 20 April 2026 00:28:31 +0000 (0:00:01.057) 0:00:22.414 **********
2026-04-20 00:28:33.581165 | orchestrator | ok: [testbed-manager]
2026-04-20 00:28:33.581171 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:28:33.581176 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:28:33.581182 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:28:33.581188 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:28:33.581193 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:28:33.581199 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:28:33.581205 | orchestrator |
2026-04-20 00:28:33.581210 | orchestrator | TASK [osism.commons.repository : Copy ubuntu.sources file] *********************
2026-04-20 00:28:33.581217 | orchestrator | Monday 20 April 2026 00:28:32 +0000 (0:00:00.570) 0:00:22.985 **********
2026-04-20 00:28:33.581223 | orchestrator | ok: [testbed-manager]
2026-04-20 00:28:33.581228 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:28:33.581234 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:28:33.581240 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:28:33.581253 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:29:14.295958 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:29:14.296075 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:29:14.296091 | orchestrator |
2026-04-20 00:29:14.296104 | orchestrator | TASK [osism.commons.repository : Update package cache] *************************
2026-04-20 00:29:14.296116 | orchestrator | Monday 20 April 2026 00:28:33 +0000 (0:00:01.086) 0:00:24.071 **********
2026-04-20 00:29:14.296126 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:29:14.296135 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:29:14.296145 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:29:14.296155 | orchestrator | changed: [testbed-manager]
2026-04-20 00:29:14.296166 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:29:14.296175 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:29:14.296185 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:29:14.296194 | orchestrator |
2026-04-20 00:29:14.296204 | orchestrator | TASK [osism.services.rsyslog : Gather variables for each operating system] *****
2026-04-20 00:29:14.296214 | orchestrator | Monday 20 April 2026 00:28:51 +0000 (0:00:17.877) 0:00:41.949 **********
2026-04-20 00:29:14.296223 | orchestrator | ok: [testbed-manager]
2026-04-20 00:29:14.296233 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:29:14.296243 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:29:14.296252 | orchestrator
| ok: [testbed-node-2] 2026-04-20 00:29:14.296262 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:29:14.296271 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:29:14.296281 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:29:14.296290 | orchestrator | 2026-04-20 00:29:14.296300 | orchestrator | TASK [osism.services.rsyslog : Set rsyslog_user variable to default value] ***** 2026-04-20 00:29:14.296310 | orchestrator | Monday 20 April 2026 00:28:51 +0000 (0:00:00.199) 0:00:42.148 ********** 2026-04-20 00:29:14.296319 | orchestrator | ok: [testbed-manager] 2026-04-20 00:29:14.296329 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:29:14.296338 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:29:14.296348 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:29:14.296357 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:29:14.296367 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:29:14.296377 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:29:14.296386 | orchestrator | 2026-04-20 00:29:14.296396 | orchestrator | TASK [osism.services.rsyslog : Set rsyslog_workdir variable to default value] *** 2026-04-20 00:29:14.296406 | orchestrator | Monday 20 April 2026 00:28:51 +0000 (0:00:00.205) 0:00:42.354 ********** 2026-04-20 00:29:14.296416 | orchestrator | ok: [testbed-manager] 2026-04-20 00:29:14.296425 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:29:14.296435 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:29:14.296444 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:29:14.296455 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:29:14.296466 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:29:14.296477 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:29:14.296488 | orchestrator | 2026-04-20 00:29:14.296499 | orchestrator | TASK [osism.services.rsyslog : Include distribution specific install tasks] **** 2026-04-20 00:29:14.296510 | orchestrator | Monday 20 April 2026 00:28:52 +0000 (0:00:00.195) 0:00:42.550 ********** 2026-04-20 
00:29:14.296523 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rsyslog/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:29:14.296537 | orchestrator | 2026-04-20 00:29:14.296548 | orchestrator | TASK [osism.services.rsyslog : Install rsyslog package] ************************ 2026-04-20 00:29:14.296560 | orchestrator | Monday 20 April 2026 00:28:52 +0000 (0:00:00.280) 0:00:42.830 ********** 2026-04-20 00:29:14.296570 | orchestrator | ok: [testbed-manager] 2026-04-20 00:29:14.296582 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:29:14.296593 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:29:14.296604 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:29:14.296615 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:29:14.296626 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:29:14.296636 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:29:14.296687 | orchestrator | 2026-04-20 00:29:14.296699 | orchestrator | TASK [osism.services.rsyslog : Copy rsyslog.conf configuration file] *********** 2026-04-20 00:29:14.296710 | orchestrator | Monday 20 April 2026 00:28:54 +0000 (0:00:01.904) 0:00:44.735 ********** 2026-04-20 00:29:14.296720 | orchestrator | changed: [testbed-manager] 2026-04-20 00:29:14.296731 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:29:14.296742 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:29:14.296754 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:29:14.296765 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:29:14.296775 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:29:14.296786 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:29:14.296797 | orchestrator | 2026-04-20 00:29:14.296808 | orchestrator | TASK [osism.services.rsyslog : Manage rsyslog service] ************************* 2026-04-20 00:29:14.296818 | 
orchestrator | Monday 20 April 2026 00:28:55 +0000 (0:00:00.973) 0:00:45.708 ********** 2026-04-20 00:29:14.296827 | orchestrator | ok: [testbed-manager] 2026-04-20 00:29:14.296837 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:29:14.296847 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:29:14.296856 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:29:14.296866 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:29:14.296875 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:29:14.296884 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:29:14.296894 | orchestrator | 2026-04-20 00:29:14.296903 | orchestrator | TASK [osism.services.rsyslog : Include fluentd tasks] ************************** 2026-04-20 00:29:14.296913 | orchestrator | Monday 20 April 2026 00:28:56 +0000 (0:00:00.872) 0:00:46.581 ********** 2026-04-20 00:29:14.296986 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rsyslog/tasks/fluentd.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:29:14.296999 | orchestrator | 2026-04-20 00:29:14.297009 | orchestrator | TASK [osism.services.rsyslog : Forward syslog message to local fluentd daemon] *** 2026-04-20 00:29:14.297019 | orchestrator | Monday 20 April 2026 00:28:56 +0000 (0:00:00.277) 0:00:46.859 ********** 2026-04-20 00:29:14.297029 | orchestrator | changed: [testbed-manager] 2026-04-20 00:29:14.297039 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:29:14.297048 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:29:14.297058 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:29:14.297067 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:29:14.297077 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:29:14.297086 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:29:14.297096 | orchestrator | 2026-04-20 00:29:14.297121 | orchestrator | TASK [osism.services.rsyslog : 
Include additional log server tasks] ************ 2026-04-20 00:29:14.297131 | orchestrator | Monday 20 April 2026 00:28:57 +0000 (0:00:01.059) 0:00:47.919 ********** 2026-04-20 00:29:14.297141 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:29:14.297150 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:29:14.297160 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:29:14.297169 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:29:14.297178 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:29:14.297188 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:29:14.297197 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:29:14.297207 | orchestrator | 2026-04-20 00:29:14.297216 | orchestrator | TASK [osism.services.rsyslog : Include logrotate tasks] ************************ 2026-04-20 00:29:14.297226 | orchestrator | Monday 20 April 2026 00:28:57 +0000 (0:00:00.250) 0:00:48.169 ********** 2026-04-20 00:29:14.297236 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rsyslog/tasks/logrotate.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:29:14.297245 | orchestrator | 2026-04-20 00:29:14.297255 | orchestrator | TASK [osism.services.rsyslog : Ensure logrotate package is installed] ********** 2026-04-20 00:29:14.297264 | orchestrator | Monday 20 April 2026 00:28:58 +0000 (0:00:00.293) 0:00:48.462 ********** 2026-04-20 00:29:14.297283 | orchestrator | ok: [testbed-manager] 2026-04-20 00:29:14.297293 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:29:14.297302 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:29:14.297312 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:29:14.297321 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:29:14.297331 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:29:14.297340 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:29:14.297349 | 
orchestrator | 2026-04-20 00:29:14.297359 | orchestrator | TASK [osism.services.rsyslog : Configure logrotate for rsyslog] **************** 2026-04-20 00:29:14.297368 | orchestrator | Monday 20 April 2026 00:29:00 +0000 (0:00:02.029) 0:00:50.492 ********** 2026-04-20 00:29:14.297378 | orchestrator | changed: [testbed-manager] 2026-04-20 00:29:14.297388 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:29:14.297397 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:29:14.297407 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:29:14.297416 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:29:14.297426 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:29:14.297435 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:29:14.297445 | orchestrator | 2026-04-20 00:29:14.297454 | orchestrator | TASK [osism.commons.systohc : Install util-linux-extra package] **************** 2026-04-20 00:29:14.297464 | orchestrator | Monday 20 April 2026 00:29:01 +0000 (0:00:01.216) 0:00:51.708 ********** 2026-04-20 00:29:14.297473 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:29:14.297483 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:29:14.297492 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:29:14.297502 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:29:14.297511 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:29:14.297521 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:29:14.297530 | orchestrator | changed: [testbed-manager] 2026-04-20 00:29:14.297540 | orchestrator | 2026-04-20 00:29:14.297549 | orchestrator | TASK [osism.commons.systohc : Sync hardware clock] ***************************** 2026-04-20 00:29:14.297559 | orchestrator | Monday 20 April 2026 00:29:11 +0000 (0:00:10.645) 0:01:02.353 ********** 2026-04-20 00:29:14.297568 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:29:14.297578 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:29:14.297587 | orchestrator | ok: 
[testbed-node-5] 2026-04-20 00:29:14.297597 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:29:14.297606 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:29:14.297616 | orchestrator | ok: [testbed-manager] 2026-04-20 00:29:14.297625 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:29:14.297635 | orchestrator | 2026-04-20 00:29:14.297645 | orchestrator | TASK [osism.commons.configfs : Start sys-kernel-config mount] ****************** 2026-04-20 00:29:14.297654 | orchestrator | Monday 20 April 2026 00:29:12 +0000 (0:00:00.722) 0:01:03.076 ********** 2026-04-20 00:29:14.297664 | orchestrator | ok: [testbed-manager] 2026-04-20 00:29:14.297673 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:29:14.297683 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:29:14.297692 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:29:14.297702 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:29:14.297744 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:29:14.297754 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:29:14.297764 | orchestrator | 2026-04-20 00:29:14.297773 | orchestrator | TASK [osism.commons.packages : Gather variables for each operating system] ***** 2026-04-20 00:29:14.297783 | orchestrator | Monday 20 April 2026 00:29:13 +0000 (0:00:00.894) 0:01:03.970 ********** 2026-04-20 00:29:14.297793 | orchestrator | ok: [testbed-manager] 2026-04-20 00:29:14.297802 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:29:14.297812 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:29:14.297821 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:29:14.297831 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:29:14.297841 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:29:14.297850 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:29:14.297860 | orchestrator | 2026-04-20 00:29:14.297869 | orchestrator | TASK [osism.commons.packages : Set required_packages_distribution variable to default value] *** 2026-04-20 00:29:14.297879 | orchestrator | Monday 
20 April 2026 00:29:13 +0000 (0:00:00.223) 0:01:04.194 ********** 2026-04-20 00:29:14.297896 | orchestrator | ok: [testbed-manager] 2026-04-20 00:29:14.297906 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:29:14.297915 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:29:14.297925 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:29:14.297952 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:29:14.297963 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:29:14.297972 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:29:14.297982 | orchestrator | 2026-04-20 00:29:14.297992 | orchestrator | TASK [osism.commons.packages : Include distribution specific package tasks] **** 2026-04-20 00:29:14.298001 | orchestrator | Monday 20 April 2026 00:29:13 +0000 (0:00:00.227) 0:01:04.422 ********** 2026-04-20 00:29:14.298011 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/packages/tasks/package-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:29:14.298068 | orchestrator | 2026-04-20 00:29:14.298086 | orchestrator | TASK [osism.commons.packages : Install needrestart package] ******************** 2026-04-20 00:31:31.669361 | orchestrator | Monday 20 April 2026 00:29:14 +0000 (0:00:00.296) 0:01:04.718 ********** 2026-04-20 00:31:31.669467 | orchestrator | ok: [testbed-manager] 2026-04-20 00:31:31.669478 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:31:31.669485 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:31:31.669492 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:31:31.669498 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:31:31.669504 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:31:31.669512 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:31:31.669518 | orchestrator | 2026-04-20 00:31:31.669526 | orchestrator | TASK [osism.commons.packages : Set needrestart mode] 
*************************** 2026-04-20 00:31:31.669532 | orchestrator | Monday 20 April 2026 00:29:16 +0000 (0:00:02.033) 0:01:06.751 ********** 2026-04-20 00:31:31.669539 | orchestrator | changed: [testbed-manager] 2026-04-20 00:31:31.669547 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:31:31.669553 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:31:31.669560 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:31:31.669567 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:31:31.669574 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:31:31.669580 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:31:31.669586 | orchestrator | 2026-04-20 00:31:31.669610 | orchestrator | TASK [osism.commons.packages : Set apt_cache_valid_time variable to default value] *** 2026-04-20 00:31:31.669619 | orchestrator | Monday 20 April 2026 00:29:17 +0000 (0:00:00.700) 0:01:07.452 ********** 2026-04-20 00:31:31.669625 | orchestrator | ok: [testbed-manager] 2026-04-20 00:31:31.669633 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:31:31.669639 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:31:31.669645 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:31:31.669652 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:31:31.669659 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:31:31.669665 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:31:31.669672 | orchestrator | 2026-04-20 00:31:31.669678 | orchestrator | TASK [osism.commons.packages : Update package cache] *************************** 2026-04-20 00:31:31.669685 | orchestrator | Monday 20 April 2026 00:29:17 +0000 (0:00:00.288) 0:01:07.741 ********** 2026-04-20 00:31:31.669692 | orchestrator | ok: [testbed-manager] 2026-04-20 00:31:31.669698 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:31:31.669704 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:31:31.669711 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:31:31.669717 | orchestrator | ok: [testbed-node-0] 
2026-04-20 00:31:31.669723 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:31:31.669730 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:31:31.669737 | orchestrator | 2026-04-20 00:31:31.669743 | orchestrator | TASK [osism.commons.packages : Download upgrade packages] ********************** 2026-04-20 00:31:31.669750 | orchestrator | Monday 20 April 2026 00:29:18 +0000 (0:00:01.430) 0:01:09.171 ********** 2026-04-20 00:31:31.669757 | orchestrator | changed: [testbed-manager] 2026-04-20 00:31:31.669767 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:31:31.669855 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:31:31.669864 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:31:31.669871 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:31:31.669877 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:31:31.669884 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:31:31.669891 | orchestrator | 2026-04-20 00:31:31.669898 | orchestrator | TASK [osism.commons.packages : Upgrade packages] ******************************* 2026-04-20 00:31:31.669906 | orchestrator | Monday 20 April 2026 00:29:21 +0000 (0:00:02.468) 0:01:11.640 ********** 2026-04-20 00:31:31.669912 | orchestrator | ok: [testbed-manager] 2026-04-20 00:31:31.669919 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:31:31.669926 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:31:31.669933 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:31:31.669940 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:31:31.669947 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:31:31.669953 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:31:31.669960 | orchestrator | 2026-04-20 00:31:31.669967 | orchestrator | TASK [osism.commons.packages : Download required packages] ********************* 2026-04-20 00:31:31.669974 | orchestrator | Monday 20 April 2026 00:29:24 +0000 (0:00:02.879) 0:01:14.519 ********** 2026-04-20 00:31:31.669981 | orchestrator | ok: 
[testbed-manager] 2026-04-20 00:31:31.669988 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:31:31.669995 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:31:31.670002 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:31:31.670009 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:31:31.670063 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:31:31.670071 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:31:31.670078 | orchestrator | 2026-04-20 00:31:31.670084 | orchestrator | TASK [osism.commons.packages : Install required packages] ********************** 2026-04-20 00:31:31.670091 | orchestrator | Monday 20 April 2026 00:30:02 +0000 (0:00:38.065) 0:01:52.585 ********** 2026-04-20 00:31:31.670097 | orchestrator | changed: [testbed-manager] 2026-04-20 00:31:31.670103 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:31:31.670110 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:31:31.670116 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:31:31.670123 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:31:31.670129 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:31:31.670136 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:31:31.670142 | orchestrator | 2026-04-20 00:31:31.670148 | orchestrator | TASK [osism.commons.packages : Remove useless packages from the cache] ********* 2026-04-20 00:31:31.670155 | orchestrator | Monday 20 April 2026 00:31:18 +0000 (0:01:16.070) 0:03:08.655 ********** 2026-04-20 00:31:31.670162 | orchestrator | ok: [testbed-manager] 2026-04-20 00:31:31.670168 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:31:31.670175 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:31:31.670181 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:31:31.670188 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:31:31.670194 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:31:31.670201 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:31:31.670207 | orchestrator | 2026-04-20 00:31:31.670219 | 
orchestrator | TASK [osism.commons.packages : Remove dependencies that are no longer required] *** 2026-04-20 00:31:31.670227 | orchestrator | Monday 20 April 2026 00:31:20 +0000 (0:00:02.001) 0:03:10.657 ********** 2026-04-20 00:31:31.670233 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:31:31.670239 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:31:31.670245 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:31:31.670252 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:31:31.670258 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:31:31.670264 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:31:31.670270 | orchestrator | changed: [testbed-manager] 2026-04-20 00:31:31.670277 | orchestrator | 2026-04-20 00:31:31.670283 | orchestrator | TASK [osism.commons.sysctl : Include sysctl tasks] ***************************** 2026-04-20 00:31:31.670289 | orchestrator | Monday 20 April 2026 00:31:30 +0000 (0:00:10.358) 0:03:21.015 ********** 2026-04-20 00:31:31.670320 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 => (item={'key': 'elasticsearch', 'value': [{'name': 'vm.max_map_count', 'value': 262144}]}) 2026-04-20 00:31:31.670344 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 => (item={'key': 'rabbitmq', 'value': [{'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}, {'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}, {'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}, {'name': 'net.core.wmem_max', 'value': 16777216}, {'name': 'net.core.rmem_max', 'value': 16777216}, {'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}, {'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}, {'name': 
'net.core.somaxconn', 'value': 4096}, {'name': 'net.ipv4.tcp_syncookies', 'value': 0}, {'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192}]}) 2026-04-20 00:31:31.670353 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 => (item={'key': 'generic', 'value': [{'name': 'vm.swappiness', 'value': 1}]}) 2026-04-20 00:31:31.670361 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 => (item={'key': 'compute', 'value': [{'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}]}) 2026-04-20 00:31:31.670368 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 => (item={'key': 'network', 'value': [{'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}]}) 2026-04-20 00:31:31.670375 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 => (item={'key': 'k3s_node', 'value': [{'name': 'fs.inotify.max_user_instances', 'value': 1024}]}) 2026-04-20 00:31:31.670381 | orchestrator | 2026-04-20 00:31:31.670388 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on elasticsearch] *********** 2026-04-20 00:31:31.670394 | orchestrator | Monday 20 April 2026 00:31:30 +0000 (0:00:00.396) 0:03:21.412 ********** 2026-04-20 00:31:31.670400 | orchestrator | skipping: [testbed-manager] => (item={'name': 'vm.max_map_count', 'value': 
262144})  2026-04-20 00:31:31.670407 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:31:31.670413 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'vm.max_map_count', 'value': 262144})  2026-04-20 00:31:31.670420 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:31:31.670426 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'vm.max_map_count', 'value': 262144})  2026-04-20 00:31:31.670432 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:31:31.670438 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'vm.max_map_count', 'value': 262144})  2026-04-20 00:31:31.670445 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:31:31.670452 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2026-04-20 00:31:31.670458 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2026-04-20 00:31:31.670465 | orchestrator | changed: [testbed-node-1] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2026-04-20 00:31:31.670471 | orchestrator | 2026-04-20 00:31:31.670478 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on rabbitmq] **************** 2026-04-20 00:31:31.670489 | orchestrator | Monday 20 April 2026 00:31:31 +0000 (0:00:00.609) 0:03:22.022 ********** 2026-04-20 00:31:31.670500 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2026-04-20 00:31:31.670508 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2026-04-20 00:31:31.670515 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2026-04-20 00:31:31.670521 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2026-04-20 00:31:31.670527 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.rmem_max', 
'value': 16777216})  2026-04-20 00:31:31.670539 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2026-04-20 00:31:37.527990 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2026-04-20 00:31:37.528128 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2026-04-20 00:31:37.528155 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2026-04-20 00:31:37.528174 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2026-04-20 00:31:37.528196 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:31:37.528217 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2026-04-20 00:31:37.528235 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2026-04-20 00:31:37.528253 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2026-04-20 00:31:37.528264 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2026-04-20 00:31:37.528281 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2026-04-20 00:31:37.528310 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2026-04-20 00:31:37.528330 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2026-04-20 00:31:37.528348 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2026-04-20 00:31:37.528366 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2026-04-20 00:31:37.528382 | orchestrator | skipping: 
[testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2026-04-20 00:31:37.528398 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2026-04-20 00:31:37.528416 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2026-04-20 00:31:37.528433 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:31:37.528451 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2026-04-20 00:31:37.528470 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2026-04-20 00:31:37.528489 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2026-04-20 00:31:37.528509 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2026-04-20 00:31:37.528527 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2026-04-20 00:31:37.528546 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2026-04-20 00:31:37.528564 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2026-04-20 00:31:37.528584 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2026-04-20 00:31:37.528643 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2026-04-20 00:31:37.528657 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2026-04-20 00:31:37.528671 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:31:37.528684 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2026-04-20 00:31:37.528696 | orchestrator | 
skipping: [testbed-node-5] => (item={'name': 'net.core.wmem_max', 'value': 16777216})
2026-04-20 00:31:37.528709 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.rmem_max', 'value': 16777216})
2026-04-20 00:31:37.528727 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})
2026-04-20 00:31:37.528746 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})
2026-04-20 00:31:37.528764 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.somaxconn', 'value': 4096})
2026-04-20 00:31:37.528810 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})
2026-04-20 00:31:37.528829 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})
2026-04-20 00:31:37.528846 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:31:37.528863 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})
2026-04-20 00:31:37.528880 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})
2026-04-20 00:31:37.528898 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})
2026-04-20 00:31:37.528916 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})
2026-04-20 00:31:37.528933 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})
2026-04-20 00:31:37.528978 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.wmem_max', 'value': 16777216})
2026-04-20 00:31:37.528998 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})
2026-04-20 00:31:37.529017 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})
2026-04-20 00:31:37.529037 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.rmem_max', 'value': 16777216})
2026-04-20 00:31:37.529055 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})
2026-04-20 00:31:37.529072 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})
2026-04-20 00:31:37.529083 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})
2026-04-20 00:31:37.529094 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.wmem_max', 'value': 16777216})
2026-04-20 00:31:37.529105 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})
2026-04-20 00:31:37.529116 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.wmem_max', 'value': 16777216})
2026-04-20 00:31:37.529127 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.rmem_max', 'value': 16777216})
2026-04-20 00:31:37.529137 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.somaxconn', 'value': 4096})
2026-04-20 00:31:37.529148 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.rmem_max', 'value': 16777216})
2026-04-20 00:31:37.529159 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})
2026-04-20 00:31:37.529169 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})
2026-04-20 00:31:37.529180 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})
2026-04-20 00:31:37.529191 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})
2026-04-20 00:31:37.529214 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})
2026-04-20 00:31:37.529225 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})
2026-04-20 00:31:37.529236 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.somaxconn', 'value': 4096})
2026-04-20 00:31:37.529247 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.somaxconn', 'value': 4096})
2026-04-20 00:31:37.529257 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})
2026-04-20 00:31:37.529268 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})
2026-04-20 00:31:37.529278 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})
2026-04-20 00:31:37.529289 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})
2026-04-20 00:31:37.529301 | orchestrator |
2026-04-20 00:31:37.529312 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on generic] *****************
2026-04-20 00:31:37.529323 | orchestrator | Monday 20 April 2026 00:31:36 +0000 (0:00:04.731) 0:03:26.753 **********
2026-04-20 00:31:37.529334 | orchestrator | changed: [testbed-manager] => (item={'name': 'vm.swappiness', 'value': 1})
2026-04-20 00:31:37.529345 | orchestrator | changed: [testbed-node-1] => (item={'name': 'vm.swappiness', 'value': 1})
2026-04-20 00:31:37.529356 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.swappiness', 'value': 1})
2026-04-20 00:31:37.529366 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.swappiness', 'value': 1})
2026-04-20 00:31:37.529377 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.swappiness', 'value': 1})
2026-04-20 00:31:37.529409 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.swappiness', 'value': 1})
2026-04-20 00:31:37.529421 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.swappiness', 'value': 1})
2026-04-20 00:31:37.529432 | orchestrator |
2026-04-20 00:31:37.529443 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on compute] *****************
2026-04-20 00:31:37.529453 | orchestrator | Monday 20 April 2026 00:31:36 +0000 (0:00:00.591) 0:03:27.345 **********
2026-04-20 00:31:37.529464 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-20 00:31:37.529475 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:31:37.529491 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-20 00:31:37.529502 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-20 00:31:37.529513 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:31:37.529523 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-20 00:31:37.529534 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:31:37.529545 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:31:37.529555 | orchestrator | changed: [testbed-node-3] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-20 00:31:37.529566 | orchestrator | changed: [testbed-node-4] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-20 00:31:37.529586 | orchestrator | changed: [testbed-node-5] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-20 00:31:50.973481 | orchestrator |
2026-04-20 00:31:50.973601 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on network] *****************
2026-04-20 00:31:50.973623 | orchestrator | Monday 20 April 2026 00:31:37 +0000 (0:00:00.641) 0:03:27.987 **********
2026-04-20 00:31:50.973637 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-20 00:31:50.973652 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:31:50.973698 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-20 00:31:50.973713 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:31:50.973728 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-20 00:31:50.973743 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:31:50.973757 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-20 00:31:50.973796 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:31:50.973806 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-20 00:31:50.973815 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-20 00:31:50.973823 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2026-04-20 00:31:50.973832 | orchestrator |
2026-04-20 00:31:50.973841 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on k3s_node] ****************
2026-04-20 00:31:50.973850 | orchestrator | Monday 20 April 2026 00:31:38 +0000 (0:00:00.560) 0:03:28.547 **********
2026-04-20 00:31:50.973859 | orchestrator | skipping: [testbed-manager] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-04-20 00:31:50.973868 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:31:50.973876 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-04-20 00:31:50.973885 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-04-20 00:31:50.973894 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:31:50.973903 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:31:50.973912 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-04-20 00:31:50.973920 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:31:50.973929 | orchestrator | changed: [testbed-node-3] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-04-20 00:31:50.973940 | orchestrator | changed: [testbed-node-5] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-04-20 00:31:50.973950 | orchestrator | changed: [testbed-node-4] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2026-04-20 00:31:50.973959 | orchestrator |
2026-04-20 00:31:50.973969 | orchestrator | TASK [osism.commons.limits : Include limits tasks] *****************************
2026-04-20 00:31:50.973979 | orchestrator | Monday 20 April 2026 00:31:39 +0000 (0:00:01.674) 0:03:30.222 **********
2026-04-20 00:31:50.973989 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:31:50.973998 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:31:50.974008 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:31:50.974086 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:31:50.974107 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:31:50.974125 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:31:50.974139 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:31:50.974154 | orchestrator |
2026-04-20 00:31:50.974167 | orchestrator | TASK [osism.commons.services : Populate service facts] *************************
2026-04-20 00:31:50.974183 | orchestrator | Monday 20 April 2026 00:31:40 +0000 (0:00:00.301) 0:03:30.523 **********
2026-04-20 00:31:50.974198 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:31:50.974214 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:31:50.974228 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:31:50.974244 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:31:50.974260 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:31:50.974275 | orchestrator | ok: [testbed-manager]
2026-04-20 00:31:50.974290 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:31:50.974305 | orchestrator |
2026-04-20 00:31:50.974321 | orchestrator | TASK [osism.commons.services : Check services] *********************************
2026-04-20 00:31:50.974336 | orchestrator | Monday 20 April 2026 00:31:45 +0000 (0:00:05.360) 0:03:35.884 **********
2026-04-20 00:31:50.974367 | orchestrator | skipping: [testbed-manager] => (item=nscd)
2026-04-20 00:31:50.974382 | orchestrator | skipping: [testbed-node-0] => (item=nscd)
2026-04-20 00:31:50.974398 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:31:50.974413 | orchestrator | skipping: [testbed-node-1] => (item=nscd)
2026-04-20 00:31:50.974429 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:31:50.974462 | orchestrator | skipping: [testbed-node-2] => (item=nscd)
2026-04-20 00:31:50.974480 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:31:50.974503 | orchestrator | skipping: [testbed-node-3] => (item=nscd)
2026-04-20 00:31:50.974516 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:31:50.974529 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:31:50.974541 | orchestrator | skipping: [testbed-node-4] => (item=nscd)
2026-04-20 00:31:50.974554 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:31:50.974568 | orchestrator | skipping: [testbed-node-5] => (item=nscd)
2026-04-20 00:31:50.974580 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:31:50.974593 | orchestrator |
2026-04-20 00:31:50.974607 | orchestrator | TASK [osism.commons.services : Start/enable required services] *****************
2026-04-20 00:31:50.974621 | orchestrator | Monday 20 April 2026 00:31:45 +0000 (0:00:00.277) 0:03:36.162 **********
2026-04-20 00:31:50.974636 | orchestrator | ok: [testbed-manager] => (item=cron)
2026-04-20 00:31:50.974651 | orchestrator | ok: [testbed-node-0] => (item=cron)
2026-04-20 00:31:50.974666 | orchestrator | ok: [testbed-node-1] => (item=cron)
2026-04-20 00:31:50.974698 | orchestrator | ok: [testbed-node-4] => (item=cron)
2026-04-20 00:31:50.974707 | orchestrator | ok: [testbed-node-2] => (item=cron)
2026-04-20 00:31:50.974715 | orchestrator | ok: [testbed-node-3] => (item=cron)
2026-04-20 00:31:50.974724 | orchestrator | ok: [testbed-node-5] => (item=cron)
2026-04-20 00:31:50.974732 | orchestrator |
2026-04-20 00:31:50.974740 | orchestrator | TASK [osism.commons.motd : Include distribution specific configure tasks] ******
2026-04-20 00:31:50.974749 | orchestrator | Monday 20 April 2026 00:31:46 +0000 (0:00:00.997) 0:03:37.160 **********
2026-04-20 00:31:50.974784 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/motd/tasks/configure-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:31:50.974798 | orchestrator |
2026-04-20 00:31:50.974806 | orchestrator | TASK [osism.commons.motd : Remove update-motd package] *************************
2026-04-20 00:31:50.974815 | orchestrator | Monday 20 April 2026 00:31:47 +0000 (0:00:00.402) 0:03:37.563 **********
2026-04-20 00:31:50.974823 | orchestrator | ok: [testbed-manager]
2026-04-20 00:31:50.974832 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:31:50.974840 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:31:50.974849 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:31:50.974857 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:31:50.974866 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:31:50.974874 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:31:50.974883 | orchestrator |
2026-04-20 00:31:50.974891 | orchestrator | TASK [osism.commons.motd : Check if /etc/default/motd-news exists] *************
2026-04-20 00:31:50.974900 | orchestrator | Monday 20 April 2026 00:31:48 +0000 (0:00:01.317) 0:03:38.881 **********
2026-04-20 00:31:50.974908 | orchestrator | ok: [testbed-manager]
2026-04-20 00:31:50.974917 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:31:50.974925 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:31:50.974933 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:31:50.974942 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:31:50.974950 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:31:50.974959 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:31:50.974967 | orchestrator |
2026-04-20 00:31:50.974976 | orchestrator | TASK [osism.commons.motd : Disable the dynamic motd-news service] **************
2026-04-20 00:31:50.974984 | orchestrator | Monday 20 April 2026 00:31:49 +0000 (0:00:00.588) 0:03:39.469 **********
2026-04-20 00:31:50.974993 | orchestrator | changed: [testbed-manager]
2026-04-20 00:31:50.975001 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:31:50.975020 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:31:50.975029 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:31:50.975038 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:31:50.975046 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:31:50.975054 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:31:50.975063 | orchestrator |
2026-04-20 00:31:50.975071 | orchestrator | TASK [osism.commons.motd : Get all configuration files in /etc/pam.d] **********
2026-04-20 00:31:50.975080 | orchestrator | Monday 20 April 2026 00:31:49 +0000 (0:00:00.760) 0:03:40.229 **********
2026-04-20 00:31:50.975088 | orchestrator | ok: [testbed-manager]
2026-04-20 00:31:50.975097 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:31:50.975105 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:31:50.975114 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:31:50.975122 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:31:50.975130 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:31:50.975139 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:31:50.975147 | orchestrator |
2026-04-20 00:31:50.975156 | orchestrator | TASK [osism.commons.motd : Remove pam_motd.so rule] ****************************
2026-04-20 00:31:50.975164 | orchestrator | Monday 20 April 2026 00:31:50 +0000 (0:00:00.612) 0:03:40.842 **********
2026-04-20 00:31:50.975178 | orchestrator | changed: [testbed-manager] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1776643534.3662808, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-20 00:31:50.975197 | orchestrator | changed: [testbed-node-1] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1776643548.2088113, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-20 00:31:50.975207 | orchestrator | changed: [testbed-node-4] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1776643539.3014445, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-20 00:31:50.975237 | orchestrator | changed: [testbed-node-3] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1776643565.666347, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-20 00:31:56.415569 | orchestrator | changed: [testbed-node-0] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1776643553.5039802, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-20 00:31:56.415725 | orchestrator | changed: [testbed-node-5] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1776643549.0782526, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-20 00:31:56.415743 | orchestrator | changed: [testbed-node-2] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1776643560.2762222, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-20 00:31:56.415846 | orchestrator | changed: [testbed-manager] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-20 00:31:56.415870 | orchestrator | changed: [testbed-node-4] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-20 00:31:56.415890 | orchestrator | changed: [testbed-node-1] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-20 00:31:56.415911 | orchestrator | changed: [testbed-node-3] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-20 00:31:56.415957 | orchestrator | changed: [testbed-node-0] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-20 00:31:56.415993 | orchestrator | changed: [testbed-node-2] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-20 00:31:56.416012 | orchestrator | changed: [testbed-node-5] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2026-04-20 00:31:56.416034 | orchestrator |
2026-04-20 00:31:56.416055 | orchestrator | TASK [osism.commons.motd : Copy motd file] *************************************
2026-04-20 00:31:56.416068 | orchestrator | Monday 20 April 2026 00:31:51 +0000 (0:00:01.053) 0:03:41.895 **********
2026-04-20 00:31:56.416081 | orchestrator | changed: [testbed-manager]
2026-04-20 00:31:56.416096 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:31:56.416108 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:31:56.416120 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:31:56.416132 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:31:56.416144 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:31:56.416156 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:31:56.416168 | orchestrator |
2026-04-20 00:31:56.416180 | orchestrator | TASK [osism.commons.motd : Copy issue file] ************************************
2026-04-20 00:31:56.416194 | orchestrator | Monday 20 April 2026 00:31:52 +0000 (0:00:01.097) 0:03:42.993 **********
2026-04-20 00:31:56.416206 | orchestrator | changed: [testbed-manager]
2026-04-20 00:31:56.416218 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:31:56.416230 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:31:56.416242 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:31:56.416254 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:31:56.416267 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:31:56.416279 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:31:56.416290 | orchestrator |
2026-04-20 00:31:56.416321 | orchestrator | TASK [osism.commons.motd : Copy issue.net file] ********************************
2026-04-20 00:31:56.416334 | orchestrator | Monday 20 April 2026 00:31:53 +0000 (0:00:01.241) 0:03:44.234 **********
2026-04-20 00:31:56.416347 | orchestrator | changed: [testbed-manager]
2026-04-20 00:31:56.416359 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:31:56.416371 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:31:56.416382 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:31:56.416394 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:31:56.416406 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:31:56.416419 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:31:56.416431 | orchestrator |
2026-04-20 00:31:56.416443 | orchestrator | TASK [osism.commons.motd : Configure SSH to print the motd] ********************
2026-04-20 00:31:56.416453 | orchestrator | Monday 20 April 2026 00:31:55 +0000 (0:00:01.213) 0:03:45.447 **********
2026-04-20 00:31:56.416469 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:31:56.416480 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:31:56.416490 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:31:56.416500 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:31:56.416511 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:31:56.416521 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:31:56.416539 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:31:56.416550 | orchestrator |
2026-04-20 00:31:56.416560 | orchestrator | TASK [osism.commons.motd : Configure SSH to not print the motd] ****************
2026-04-20 00:31:56.416571 | orchestrator | Monday 20 April 2026 00:31:55 +0000 (0:00:00.255) 0:03:45.703 **********
2026-04-20 00:31:56.416581 | orchestrator | ok: [testbed-manager]
2026-04-20 00:31:56.416592 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:31:56.416603 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:31:56.416622 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:31:56.416640 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:31:56.416658 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:31:56.416675 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:31:56.416691 | orchestrator |
2026-04-20 00:31:56.416709 | orchestrator | TASK [osism.services.rng : Include distribution specific install tasks] ********
2026-04-20 00:31:56.416728 | orchestrator | Monday 20 April 2026 00:31:56 +0000 (0:00:00.779) 0:03:46.482 **********
2026-04-20 00:31:56.416750 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rng/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:31:56.416800 | orchestrator |
2026-04-20 00:31:56.416819 | orchestrator | TASK [osism.services.rng : Install rng package] ********************************
2026-04-20 00:31:56.416853 | orchestrator | Monday 20 April 2026 00:31:56 +0000 (0:00:00.354) 0:03:46.837 **********
2026-04-20 00:33:14.999962 | orchestrator | ok: [testbed-manager]
2026-04-20 00:33:15.000092 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:33:15.000111 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:33:15.000123 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:33:15.000134 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:33:15.000145 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:33:15.000156 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:33:15.000168 | orchestrator |
2026-04-20 00:33:15.000180 | orchestrator | TASK [osism.services.rng : Remove haveged package] *****************************
2026-04-20 00:33:15.000193 | orchestrator | Monday 20 April 2026 00:32:05 +0000 (0:00:09.222) 0:03:56.059 **********
2026-04-20 00:33:15.000204 | orchestrator | ok: [testbed-manager]
2026-04-20 00:33:15.000214 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:33:15.000225 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:33:15.000236 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:33:15.000246 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:33:15.000257 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:33:15.000267 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:33:15.000278 | orchestrator |
2026-04-20 00:33:15.000289 | orchestrator | TASK [osism.services.rng : Manage rng service] *********************************
2026-04-20 00:33:15.000300 | orchestrator | Monday 20 April 2026 00:32:06 +0000 (0:00:01.339) 0:03:57.399 **********
2026-04-20 00:33:15.000310 | orchestrator | ok: [testbed-manager]
2026-04-20 00:33:15.000321 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:33:15.000332 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:33:15.000342 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:33:15.000353 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:33:15.000363 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:33:15.000374 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:33:15.000385 | orchestrator |
2026-04-20 00:33:15.000396 | orchestrator | TASK [osism.commons.cleanup : Gather variables for each operating system] ******
2026-04-20 00:33:15.000407 | orchestrator | Monday 20 April 2026 00:32:07 +0000 (0:00:00.951) 0:03:58.350 **********
2026-04-20 00:33:15.000418 | orchestrator | ok: [testbed-manager]
2026-04-20 00:33:15.000429 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:33:15.000439 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:33:15.000450 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:33:15.000461 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:33:15.000474 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:33:15.000486 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:33:15.000498 | orchestrator |
2026-04-20 00:33:15.000538 | orchestrator | TASK [osism.commons.cleanup : Set cleanup_packages_distribution variable to default value] ***
2026-04-20 00:33:15.000555 | orchestrator | Monday 20 April 2026 00:32:08 +0000 (0:00:00.276) 0:03:58.627 **********
2026-04-20 00:33:15.000574 | orchestrator | ok: [testbed-manager]
2026-04-20 00:33:15.000593 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:33:15.000613 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:33:15.000663 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:33:15.000685 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:33:15.000702 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:33:15.000715 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:33:15.000727 | orchestrator |
2026-04-20 00:33:15.000740 | orchestrator | TASK [osism.commons.cleanup : Set cleanup_services_distribution variable to default value] ***
2026-04-20 00:33:15.000752 | orchestrator | Monday 20 April 2026 00:32:08 +0000 (0:00:00.274) 0:03:58.901 **********
2026-04-20 00:33:15.000765 | orchestrator | ok: [testbed-manager]
2026-04-20 00:33:15.000780 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:33:15.000793 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:33:15.000805 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:33:15.000818 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:33:15.000829 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:33:15.000839 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:33:15.000850 | orchestrator |
2026-04-20 00:33:15.000861 | orchestrator | TASK [osism.commons.cleanup : Populate service facts] **************************
2026-04-20 00:33:15.000885 | orchestrator | Monday 20 April 2026 00:32:08 +0000 (0:00:00.303) 0:03:59.205 **********
2026-04-20 00:33:15.000896 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:33:15.000907 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:33:15.000917 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:33:15.000928 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:33:15.000938 | orchestrator | ok: [testbed-manager]
2026-04-20 00:33:15.000949 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:33:15.000959 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:33:15.000970 | orchestrator |
2026-04-20 00:33:15.000981 | orchestrator | TASK [osism.commons.cleanup : Include distribution specific timer tasks] *******
2026-04-20 00:33:15.000991 | orchestrator | Monday 20 April 2026 00:32:13 +0000 (0:00:05.078) 0:04:04.283 **********
2026-04-20 00:33:15.001021 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/timers-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:33:15.001035 | orchestrator |
2026-04-20 00:33:15.001047 | orchestrator | TASK [osism.commons.cleanup : Disable apt-daily timers] ************************
2026-04-20 00:33:15.001057 | orchestrator | Monday 20 April 2026 00:32:14 +0000 (0:00:00.377) 0:04:04.661 **********
2026-04-20 00:33:15.001068 | orchestrator | skipping: [testbed-manager] => (item=apt-daily-upgrade)
2026-04-20 00:33:15.001078 | orchestrator | skipping: [testbed-manager] => (item=apt-daily)
2026-04-20 00:33:15.001089 | orchestrator | skipping: [testbed-node-0] => (item=apt-daily-upgrade)
2026-04-20 00:33:15.001100 | orchestrator | skipping: [testbed-node-0] => (item=apt-daily)
2026-04-20 00:33:15.001111 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:33:15.001122 | orchestrator | skipping: [testbed-node-1] => (item=apt-daily-upgrade)
2026-04-20 00:33:15.001132 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:33:15.001143 | orchestrator | skipping: [testbed-node-1] => (item=apt-daily)
2026-04-20 00:33:15.001154 | orchestrator | skipping: [testbed-node-2] => (item=apt-daily-upgrade)
2026-04-20 00:33:15.001164 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:33:15.001175 | orchestrator | skipping: [testbed-node-2] => (item=apt-daily)  2026-04-20 00:33:15.001185 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:33:15.001196 | orchestrator | skipping: [testbed-node-3] => (item=apt-daily-upgrade)  2026-04-20 00:33:15.001206 | orchestrator | skipping: [testbed-node-3] => (item=apt-daily)  2026-04-20 00:33:15.001217 | orchestrator | skipping: [testbed-node-4] => (item=apt-daily-upgrade)  2026-04-20 00:33:15.001237 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:33:15.001268 | orchestrator | skipping: [testbed-node-4] => (item=apt-daily)  2026-04-20 00:33:15.001280 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:33:15.001291 | orchestrator | skipping: [testbed-node-5] => (item=apt-daily-upgrade)  2026-04-20 00:33:15.001301 | orchestrator | skipping: [testbed-node-5] => (item=apt-daily)  2026-04-20 00:33:15.001312 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:33:15.001322 | orchestrator | 2026-04-20 00:33:15.001333 | orchestrator | TASK [osism.commons.cleanup : Include service tasks] *************************** 2026-04-20 00:33:15.001343 | orchestrator | Monday 20 April 2026 00:32:14 +0000 (0:00:00.328) 0:04:04.989 ********** 2026-04-20 00:33:15.001355 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/services-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:33:15.001366 | orchestrator | 2026-04-20 00:33:15.001376 | orchestrator | TASK [osism.commons.cleanup : Cleanup services] ******************************** 2026-04-20 00:33:15.001387 | orchestrator | Monday 20 April 2026 00:32:15 +0000 (0:00:00.500) 0:04:05.489 ********** 2026-04-20 00:33:15.001398 | orchestrator | skipping: [testbed-manager] => (item=ModemManager.service)  2026-04-20 
00:33:15.001408 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:33:15.001419 | orchestrator | skipping: [testbed-node-0] => (item=ModemManager.service)  2026-04-20 00:33:15.001430 | orchestrator | skipping: [testbed-node-1] => (item=ModemManager.service)  2026-04-20 00:33:15.001440 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:33:15.001451 | orchestrator | skipping: [testbed-node-2] => (item=ModemManager.service)  2026-04-20 00:33:15.001462 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:33:15.001472 | orchestrator | skipping: [testbed-node-3] => (item=ModemManager.service)  2026-04-20 00:33:15.001483 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:33:15.001494 | orchestrator | skipping: [testbed-node-4] => (item=ModemManager.service)  2026-04-20 00:33:15.001504 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:33:15.001515 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:33:15.001526 | orchestrator | skipping: [testbed-node-5] => (item=ModemManager.service)  2026-04-20 00:33:15.001536 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:33:15.001547 | orchestrator | 2026-04-20 00:33:15.001557 | orchestrator | TASK [osism.commons.cleanup : Include packages tasks] ************************** 2026-04-20 00:33:15.001568 | orchestrator | Monday 20 April 2026 00:32:15 +0000 (0:00:00.297) 0:04:05.787 ********** 2026-04-20 00:33:15.001579 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/packages-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:33:15.001590 | orchestrator | 2026-04-20 00:33:15.001600 | orchestrator | TASK [osism.commons.cleanup : Cleanup installed packages] ********************** 2026-04-20 00:33:15.001611 | orchestrator | Monday 20 April 2026 00:32:15 +0000 (0:00:00.376) 0:04:06.164 ********** 2026-04-20 00:33:15.001678 | 
orchestrator | changed: [testbed-manager] 2026-04-20 00:33:15.001692 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:33:15.001702 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:33:15.001713 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:33:15.001724 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:33:15.001734 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:33:15.001745 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:33:15.001755 | orchestrator | 2026-04-20 00:33:15.001766 | orchestrator | TASK [osism.commons.cleanup : Remove cloudinit package] ************************ 2026-04-20 00:33:15.001776 | orchestrator | Monday 20 April 2026 00:32:50 +0000 (0:00:34.657) 0:04:40.821 ********** 2026-04-20 00:33:15.001787 | orchestrator | changed: [testbed-manager] 2026-04-20 00:33:15.001797 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:33:15.001808 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:33:15.001827 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:33:15.001837 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:33:15.001848 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:33:15.001858 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:33:15.001869 | orchestrator | 2026-04-20 00:33:15.001885 | orchestrator | TASK [osism.commons.cleanup : Uninstall unattended-upgrades package] *********** 2026-04-20 00:33:15.001896 | orchestrator | Monday 20 April 2026 00:32:58 +0000 (0:00:08.518) 0:04:49.340 ********** 2026-04-20 00:33:15.001907 | orchestrator | changed: [testbed-manager] 2026-04-20 00:33:15.001917 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:33:15.001928 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:33:15.001938 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:33:15.001949 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:33:15.001959 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:33:15.001969 | orchestrator | changed: 
[testbed-node-3] 2026-04-20 00:33:15.001980 | orchestrator | 2026-04-20 00:33:15.001991 | orchestrator | TASK [osism.commons.cleanup : Remove useless packages from the cache] ********** 2026-04-20 00:33:15.002001 | orchestrator | Monday 20 April 2026 00:33:07 +0000 (0:00:08.395) 0:04:57.735 ********** 2026-04-20 00:33:15.002012 | orchestrator | ok: [testbed-manager] 2026-04-20 00:33:15.002117 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:33:15.002129 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:33:15.002160 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:33:15.002171 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:33:15.002182 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:33:15.002192 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:33:15.002202 | orchestrator | 2026-04-20 00:33:15.002213 | orchestrator | TASK [osism.commons.cleanup : Remove dependencies that are no longer required] *** 2026-04-20 00:33:15.002224 | orchestrator | Monday 20 April 2026 00:33:08 +0000 (0:00:01.680) 0:04:59.415 ********** 2026-04-20 00:33:15.002234 | orchestrator | changed: [testbed-manager] 2026-04-20 00:33:15.002245 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:33:15.002256 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:33:15.002266 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:33:15.002277 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:33:15.002287 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:33:15.002298 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:33:15.002308 | orchestrator | 2026-04-20 00:33:15.002329 | orchestrator | TASK [osism.commons.cleanup : Include cloudinit tasks] ************************* 2026-04-20 00:33:25.870529 | orchestrator | Monday 20 April 2026 00:33:14 +0000 (0:00:06.006) 0:05:05.422 ********** 2026-04-20 00:33:25.870700 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/cloudinit.yml for testbed-manager, 
testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:33:25.870731 | orchestrator | 2026-04-20 00:33:25.870755 | orchestrator | TASK [osism.commons.cleanup : Remove cloud-init configuration directory] ******* 2026-04-20 00:33:25.870777 | orchestrator | Monday 20 April 2026 00:33:15 +0000 (0:00:00.386) 0:05:05.808 ********** 2026-04-20 00:33:25.870798 | orchestrator | changed: [testbed-manager] 2026-04-20 00:33:25.870819 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:33:25.870839 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:33:25.870861 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:33:25.870881 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:33:25.870900 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:33:25.870920 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:33:25.870940 | orchestrator | 2026-04-20 00:33:25.870960 | orchestrator | TASK [osism.commons.timezone : Install tzdata package] ************************* 2026-04-20 00:33:25.870980 | orchestrator | Monday 20 April 2026 00:33:16 +0000 (0:00:00.730) 0:05:06.539 ********** 2026-04-20 00:33:25.870999 | orchestrator | ok: [testbed-manager] 2026-04-20 00:33:25.871019 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:33:25.871037 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:33:25.871058 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:33:25.871117 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:33:25.871138 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:33:25.871158 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:33:25.871180 | orchestrator | 2026-04-20 00:33:25.871202 | orchestrator | TASK [osism.commons.timezone : Set timezone to UTC] **************************** 2026-04-20 00:33:25.871223 | orchestrator | Monday 20 April 2026 00:33:17 +0000 (0:00:01.738) 0:05:08.278 ********** 2026-04-20 00:33:25.871242 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:33:25.871263 | 
orchestrator | changed: [testbed-node-3] 2026-04-20 00:33:25.871284 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:33:25.871305 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:33:25.871327 | orchestrator | changed: [testbed-manager] 2026-04-20 00:33:25.871348 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:33:25.871369 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:33:25.871388 | orchestrator | 2026-04-20 00:33:25.871406 | orchestrator | TASK [osism.commons.timezone : Create /etc/adjtime file] *********************** 2026-04-20 00:33:25.871424 | orchestrator | Monday 20 April 2026 00:33:18 +0000 (0:00:00.782) 0:05:09.060 ********** 2026-04-20 00:33:25.871442 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:33:25.871460 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:33:25.871479 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:33:25.871497 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:33:25.871515 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:33:25.871534 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:33:25.871552 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:33:25.871567 | orchestrator | 2026-04-20 00:33:25.871578 | orchestrator | TASK [osism.commons.timezone : Ensure UTC in /etc/adjtime] ********************* 2026-04-20 00:33:25.871589 | orchestrator | Monday 20 April 2026 00:33:18 +0000 (0:00:00.259) 0:05:09.319 ********** 2026-04-20 00:33:25.871600 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:33:25.871646 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:33:25.871658 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:33:25.871669 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:33:25.871679 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:33:25.871690 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:33:25.871701 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:33:25.871711 | 
orchestrator | 2026-04-20 00:33:25.871723 | orchestrator | TASK [osism.services.docker : Gather variables for each operating system] ****** 2026-04-20 00:33:25.871733 | orchestrator | Monday 20 April 2026 00:33:19 +0000 (0:00:00.368) 0:05:09.688 ********** 2026-04-20 00:33:25.871744 | orchestrator | ok: [testbed-manager] 2026-04-20 00:33:25.871754 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:33:25.871765 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:33:25.871776 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:33:25.871786 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:33:25.871797 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:33:25.871807 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:33:25.871826 | orchestrator | 2026-04-20 00:33:25.871865 | orchestrator | TASK [osism.services.docker : Set docker_version variable to default value] **** 2026-04-20 00:33:25.871884 | orchestrator | Monday 20 April 2026 00:33:19 +0000 (0:00:00.437) 0:05:10.126 ********** 2026-04-20 00:33:25.871903 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:33:25.871923 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:33:25.871941 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:33:25.871960 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:33:25.871971 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:33:25.871982 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:33:25.871992 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:33:25.872003 | orchestrator | 2026-04-20 00:33:25.872014 | orchestrator | TASK [osism.services.docker : Set docker_cli_version variable to default value] *** 2026-04-20 00:33:25.872025 | orchestrator | Monday 20 April 2026 00:33:19 +0000 (0:00:00.240) 0:05:10.366 ********** 2026-04-20 00:33:25.872036 | orchestrator | ok: [testbed-manager] 2026-04-20 00:33:25.872047 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:33:25.872068 | orchestrator | ok: [testbed-node-1] 2026-04-20 
00:33:25.872079 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:33:25.872090 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:33:25.872108 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:33:25.872125 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:33:25.872143 | orchestrator | 2026-04-20 00:33:25.872160 | orchestrator | TASK [osism.services.docker : Print used docker version] *********************** 2026-04-20 00:33:25.872178 | orchestrator | Monday 20 April 2026 00:33:20 +0000 (0:00:00.284) 0:05:10.651 ********** 2026-04-20 00:33:25.872196 | orchestrator | ok: [testbed-manager] =>  2026-04-20 00:33:25.872215 | orchestrator |  docker_version: 5:27.5.1 2026-04-20 00:33:25.872232 | orchestrator | ok: [testbed-node-0] =>  2026-04-20 00:33:25.872251 | orchestrator |  docker_version: 5:27.5.1 2026-04-20 00:33:25.872262 | orchestrator | ok: [testbed-node-1] =>  2026-04-20 00:33:25.872273 | orchestrator |  docker_version: 5:27.5.1 2026-04-20 00:33:25.872284 | orchestrator | ok: [testbed-node-2] =>  2026-04-20 00:33:25.872294 | orchestrator |  docker_version: 5:27.5.1 2026-04-20 00:33:25.872328 | orchestrator | ok: [testbed-node-3] =>  2026-04-20 00:33:25.872339 | orchestrator |  docker_version: 5:27.5.1 2026-04-20 00:33:25.872350 | orchestrator | ok: [testbed-node-4] =>  2026-04-20 00:33:25.872360 | orchestrator |  docker_version: 5:27.5.1 2026-04-20 00:33:25.872371 | orchestrator | ok: [testbed-node-5] =>  2026-04-20 00:33:25.872382 | orchestrator |  docker_version: 5:27.5.1 2026-04-20 00:33:25.872392 | orchestrator | 2026-04-20 00:33:25.872403 | orchestrator | TASK [osism.services.docker : Print used docker cli version] ******************* 2026-04-20 00:33:25.872414 | orchestrator | Monday 20 April 2026 00:33:20 +0000 (0:00:00.361) 0:05:11.013 ********** 2026-04-20 00:33:25.872425 | orchestrator | ok: [testbed-manager] =>  2026-04-20 00:33:25.872435 | orchestrator |  docker_cli_version: 5:27.5.1 2026-04-20 00:33:25.872446 | orchestrator | ok: 
[testbed-node-0] =>  2026-04-20 00:33:25.872456 | orchestrator |  docker_cli_version: 5:27.5.1 2026-04-20 00:33:25.872467 | orchestrator | ok: [testbed-node-1] =>  2026-04-20 00:33:25.872478 | orchestrator |  docker_cli_version: 5:27.5.1 2026-04-20 00:33:25.872489 | orchestrator | ok: [testbed-node-2] =>  2026-04-20 00:33:25.872499 | orchestrator |  docker_cli_version: 5:27.5.1 2026-04-20 00:33:25.872510 | orchestrator | ok: [testbed-node-3] =>  2026-04-20 00:33:25.872521 | orchestrator |  docker_cli_version: 5:27.5.1 2026-04-20 00:33:25.872531 | orchestrator | ok: [testbed-node-4] =>  2026-04-20 00:33:25.872542 | orchestrator |  docker_cli_version: 5:27.5.1 2026-04-20 00:33:25.872552 | orchestrator | ok: [testbed-node-5] =>  2026-04-20 00:33:25.872563 | orchestrator |  docker_cli_version: 5:27.5.1 2026-04-20 00:33:25.872580 | orchestrator | 2026-04-20 00:33:25.872599 | orchestrator | TASK [osism.services.docker : Include block storage tasks] ********************* 2026-04-20 00:33:25.872641 | orchestrator | Monday 20 April 2026 00:33:20 +0000 (0:00:00.262) 0:05:11.275 ********** 2026-04-20 00:33:25.872658 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:33:25.872677 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:33:25.872696 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:33:25.872715 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:33:25.872733 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:33:25.872752 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:33:25.872769 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:33:25.872784 | orchestrator | 2026-04-20 00:33:25.872795 | orchestrator | TASK [osism.services.docker : Include zram storage tasks] ********************** 2026-04-20 00:33:25.872805 | orchestrator | Monday 20 April 2026 00:33:21 +0000 (0:00:00.250) 0:05:11.526 ********** 2026-04-20 00:33:25.872816 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:33:25.872827 | orchestrator | 
skipping: [testbed-node-0] 2026-04-20 00:33:25.872837 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:33:25.872848 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:33:25.872858 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:33:25.872869 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:33:25.872879 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:33:25.872901 | orchestrator | 2026-04-20 00:33:25.872912 | orchestrator | TASK [osism.services.docker : Include docker install tasks] ******************** 2026-04-20 00:33:25.872923 | orchestrator | Monday 20 April 2026 00:33:21 +0000 (0:00:00.261) 0:05:11.787 ********** 2026-04-20 00:33:25.872935 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/install-docker-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:33:25.872948 | orchestrator | 2026-04-20 00:33:25.872959 | orchestrator | TASK [osism.services.docker : Remove old architecture-dependent repository] **** 2026-04-20 00:33:25.872970 | orchestrator | Monday 20 April 2026 00:33:21 +0000 (0:00:00.409) 0:05:12.197 ********** 2026-04-20 00:33:25.872980 | orchestrator | ok: [testbed-manager] 2026-04-20 00:33:25.872991 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:33:25.873001 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:33:25.873012 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:33:25.873022 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:33:25.873033 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:33:25.873043 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:33:25.873054 | orchestrator | 2026-04-20 00:33:25.873064 | orchestrator | TASK [osism.services.docker : Gather package facts] **************************** 2026-04-20 00:33:25.873075 | orchestrator | Monday 20 April 2026 00:33:22 +0000 (0:00:00.802) 0:05:12.999 ********** 2026-04-20 
00:33:25.873085 | orchestrator | ok: [testbed-manager] 2026-04-20 00:33:25.873096 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:33:25.873107 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:33:25.873118 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:33:25.873128 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:33:25.873138 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:33:25.873149 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:33:25.873159 | orchestrator | 2026-04-20 00:33:25.873171 | orchestrator | TASK [osism.services.docker : Check whether packages are installed that should not be installed] *** 2026-04-20 00:33:25.873182 | orchestrator | Monday 20 April 2026 00:33:25 +0000 (0:00:02.929) 0:05:15.929 ********** 2026-04-20 00:33:25.873193 | orchestrator | skipping: [testbed-manager] => (item=containerd)  2026-04-20 00:33:25.873205 | orchestrator | skipping: [testbed-manager] => (item=docker.io)  2026-04-20 00:33:25.873215 | orchestrator | skipping: [testbed-manager] => (item=docker-engine)  2026-04-20 00:33:25.873226 | orchestrator | skipping: [testbed-node-0] => (item=containerd)  2026-04-20 00:33:25.873237 | orchestrator | skipping: [testbed-node-0] => (item=docker.io)  2026-04-20 00:33:25.873248 | orchestrator | skipping: [testbed-node-0] => (item=docker-engine)  2026-04-20 00:33:25.873258 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:33:25.873269 | orchestrator | skipping: [testbed-node-1] => (item=containerd)  2026-04-20 00:33:25.873280 | orchestrator | skipping: [testbed-node-1] => (item=docker.io)  2026-04-20 00:33:25.873291 | orchestrator | skipping: [testbed-node-1] => (item=docker-engine)  2026-04-20 00:33:25.873301 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:33:25.873312 | orchestrator | skipping: [testbed-node-2] => (item=containerd)  2026-04-20 00:33:25.873323 | orchestrator | skipping: [testbed-node-2] => (item=docker.io)  2026-04-20 00:33:25.873334 | orchestrator | skipping: [testbed-node-2] => 
(item=docker-engine)  2026-04-20 00:33:25.873344 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:33:25.873355 | orchestrator | skipping: [testbed-node-3] => (item=containerd)  2026-04-20 00:33:25.873375 | orchestrator | skipping: [testbed-node-3] => (item=docker.io)  2026-04-20 00:34:27.768620 | orchestrator | skipping: [testbed-node-3] => (item=docker-engine)  2026-04-20 00:34:27.768723 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:34:27.768733 | orchestrator | skipping: [testbed-node-4] => (item=containerd)  2026-04-20 00:34:27.768785 | orchestrator | skipping: [testbed-node-4] => (item=docker.io)  2026-04-20 00:34:27.768793 | orchestrator | skipping: [testbed-node-4] => (item=docker-engine)  2026-04-20 00:34:27.768817 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:34:27.768823 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:34:27.768830 | orchestrator | skipping: [testbed-node-5] => (item=containerd)  2026-04-20 00:34:27.768836 | orchestrator | skipping: [testbed-node-5] => (item=docker.io)  2026-04-20 00:34:27.768842 | orchestrator | skipping: [testbed-node-5] => (item=docker-engine)  2026-04-20 00:34:27.768848 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:34:27.768855 | orchestrator | 2026-04-20 00:34:27.768863 | orchestrator | TASK [osism.services.docker : Install apt-transport-https package] ************* 2026-04-20 00:34:27.768871 | orchestrator | Monday 20 April 2026 00:33:26 +0000 (0:00:00.584) 0:05:16.514 ********** 2026-04-20 00:34:27.768877 | orchestrator | ok: [testbed-manager] 2026-04-20 00:34:27.768883 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:34:27.768890 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:34:27.768896 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:34:27.768902 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:34:27.768908 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:34:27.768914 | orchestrator | changed: [testbed-node-3] 
2026-04-20 00:34:27.768920 | orchestrator | 2026-04-20 00:34:27.768926 | orchestrator | TASK [osism.services.docker : Add repository gpg key] ************************** 2026-04-20 00:34:27.768932 | orchestrator | Monday 20 April 2026 00:33:33 +0000 (0:00:07.610) 0:05:24.124 ********** 2026-04-20 00:34:27.768938 | orchestrator | ok: [testbed-manager] 2026-04-20 00:34:27.768944 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:34:27.768950 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:34:27.768956 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:34:27.768963 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:34:27.768969 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:34:27.768974 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:34:27.768981 | orchestrator | 2026-04-20 00:34:27.768987 | orchestrator | TASK [osism.services.docker : Add repository] ********************************** 2026-04-20 00:34:27.768993 | orchestrator | Monday 20 April 2026 00:33:34 +0000 (0:00:01.027) 0:05:25.152 ********** 2026-04-20 00:34:27.768999 | orchestrator | ok: [testbed-manager] 2026-04-20 00:34:27.769005 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:34:27.769011 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:34:27.769017 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:34:27.769023 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:34:27.769029 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:34:27.769035 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:34:27.769041 | orchestrator | 2026-04-20 00:34:27.769047 | orchestrator | TASK [osism.services.docker : Update package cache] **************************** 2026-04-20 00:34:27.769053 | orchestrator | Monday 20 April 2026 00:33:43 +0000 (0:00:09.253) 0:05:34.405 ********** 2026-04-20 00:34:27.769059 | orchestrator | changed: [testbed-manager] 2026-04-20 00:34:27.769065 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:34:27.769071 | 
orchestrator | changed: [testbed-node-1] 2026-04-20 00:34:27.769077 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:34:27.769083 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:34:27.769089 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:34:27.769095 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:34:27.769101 | orchestrator | 2026-04-20 00:34:27.769107 | orchestrator | TASK [osism.services.docker : Pin docker package version] ********************** 2026-04-20 00:34:27.769114 | orchestrator | Monday 20 April 2026 00:33:47 +0000 (0:00:03.491) 0:05:37.897 ********** 2026-04-20 00:34:27.769120 | orchestrator | ok: [testbed-manager] 2026-04-20 00:34:27.769126 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:34:27.769132 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:34:27.769138 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:34:27.769144 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:34:27.769150 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:34:27.769156 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:34:27.769162 | orchestrator | 2026-04-20 00:34:27.769168 | orchestrator | TASK [osism.services.docker : Pin docker-cli package version] ****************** 2026-04-20 00:34:27.769182 | orchestrator | Monday 20 April 2026 00:33:48 +0000 (0:00:01.237) 0:05:39.134 ********** 2026-04-20 00:34:27.769189 | orchestrator | ok: [testbed-manager] 2026-04-20 00:34:27.769195 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:34:27.769201 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:34:27.769207 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:34:27.769213 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:34:27.769219 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:34:27.769225 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:34:27.769231 | orchestrator | 2026-04-20 00:34:27.769237 | orchestrator | TASK [osism.services.docker : Unlock 
containerd package] ***********************
2026-04-20 00:34:27.769243 | orchestrator | Monday 20 April 2026 00:33:50 +0000 (0:00:01.363) 0:05:40.498 **********
2026-04-20 00:34:27.769249 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:34:27.769255 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:34:27.769261 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:34:27.769267 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:34:27.769273 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:34:27.769279 | orchestrator | changed: [testbed-manager]
2026-04-20 00:34:27.769285 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:34:27.769291 | orchestrator |
2026-04-20 00:34:27.769297 | orchestrator | TASK [osism.services.docker : Install containerd package] **********************
2026-04-20 00:34:27.769304 | orchestrator | Monday 20 April 2026 00:33:50 +0000 (0:00:00.511) 0:05:41.009 **********
2026-04-20 00:34:27.769309 | orchestrator | ok: [testbed-manager]
2026-04-20 00:34:27.769316 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:34:27.769322 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:34:27.769328 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:34:27.769334 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:34:27.769340 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:34:27.769346 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:34:27.769352 | orchestrator |
2026-04-20 00:34:27.769358 | orchestrator | TASK [osism.services.docker : Lock containerd package] *************************
2026-04-20 00:34:27.769381 | orchestrator | Monday 20 April 2026 00:34:00 +0000 (0:00:09.842) 0:05:50.852 **********
2026-04-20 00:34:27.769388 | orchestrator | changed: [testbed-manager]
2026-04-20 00:34:27.769394 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:34:27.769400 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:34:27.769406 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:34:27.769412 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:34:27.769418 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:34:27.769424 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:34:27.769430 | orchestrator |
2026-04-20 00:34:27.769436 | orchestrator | TASK [osism.services.docker : Install docker-cli package] **********************
2026-04-20 00:34:27.769442 | orchestrator | Monday 20 April 2026 00:34:01 +0000 (0:00:01.053) 0:05:51.905 **********
2026-04-20 00:34:27.769448 | orchestrator | ok: [testbed-manager]
2026-04-20 00:34:27.769455 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:34:27.769461 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:34:27.769467 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:34:27.769473 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:34:27.769479 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:34:27.769485 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:34:27.769574 | orchestrator |
2026-04-20 00:34:27.769588 | orchestrator | TASK [osism.services.docker : Install docker package] **************************
2026-04-20 00:34:27.769595 | orchestrator | Monday 20 April 2026 00:34:10 +0000 (0:00:09.508) 0:06:01.413 **********
2026-04-20 00:34:27.769601 | orchestrator | ok: [testbed-manager]
2026-04-20 00:34:27.769607 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:34:27.769613 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:34:27.769619 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:34:27.769625 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:34:27.769631 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:34:27.769643 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:34:27.769649 | orchestrator |
2026-04-20 00:34:27.769655 | orchestrator | TASK [osism.services.docker : Unblock installation of python docker packages] ***
2026-04-20 00:34:27.769661 | orchestrator | Monday 20 April 2026 00:34:21 +0000 (0:00:10.721) 0:06:12.135 **********
2026-04-20 00:34:27.769668 | orchestrator | ok: [testbed-manager] => (item=python3-docker)
2026-04-20 00:34:27.769674 | orchestrator | ok: [testbed-node-0] => (item=python3-docker)
2026-04-20 00:34:27.769680 | orchestrator | ok: [testbed-node-1] => (item=python3-docker)
2026-04-20 00:34:27.769686 | orchestrator | ok: [testbed-node-2] => (item=python3-docker)
2026-04-20 00:34:27.769692 | orchestrator | ok: [testbed-node-3] => (item=python3-docker)
2026-04-20 00:34:27.769698 | orchestrator | ok: [testbed-manager] => (item=python-docker)
2026-04-20 00:34:27.769704 | orchestrator | ok: [testbed-node-4] => (item=python3-docker)
2026-04-20 00:34:27.769710 | orchestrator | ok: [testbed-node-0] => (item=python-docker)
2026-04-20 00:34:27.769716 | orchestrator | ok: [testbed-node-5] => (item=python3-docker)
2026-04-20 00:34:27.769722 | orchestrator | ok: [testbed-node-1] => (item=python-docker)
2026-04-20 00:34:27.769728 | orchestrator | ok: [testbed-node-2] => (item=python-docker)
2026-04-20 00:34:27.769734 | orchestrator | ok: [testbed-node-3] => (item=python-docker)
2026-04-20 00:34:27.769740 | orchestrator | ok: [testbed-node-4] => (item=python-docker)
2026-04-20 00:34:27.769746 | orchestrator | ok: [testbed-node-5] => (item=python-docker)
2026-04-20 00:34:27.769752 | orchestrator |
2026-04-20 00:34:27.769758 | orchestrator | TASK [osism.services.docker : Install python3 docker package] ******************
2026-04-20 00:34:27.769764 | orchestrator | Monday 20 April 2026 00:34:22 +0000 (0:00:01.170) 0:06:13.306 **********
2026-04-20 00:34:27.769770 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:34:27.769776 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:34:27.769782 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:34:27.769788 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:34:27.769794 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:34:27.769800 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:34:27.769806 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:34:27.769813 | orchestrator |
2026-04-20 00:34:27.769819 | orchestrator | TASK [osism.services.docker : Install python3 docker package from Debian Sid] ***
2026-04-20 00:34:27.769825 | orchestrator | Monday 20 April 2026 00:34:23 +0000 (0:00:00.533) 0:06:13.839 **********
2026-04-20 00:34:27.769831 | orchestrator | ok: [testbed-manager]
2026-04-20 00:34:27.769837 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:34:27.769843 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:34:27.769849 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:34:27.769855 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:34:27.769861 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:34:27.769867 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:34:27.769873 | orchestrator |
2026-04-20 00:34:27.769880 | orchestrator | TASK [osism.services.docker : Remove python docker packages (install python bindings from pip)] ***
2026-04-20 00:34:27.769887 | orchestrator | Monday 20 April 2026 00:34:27 +0000 (0:00:03.692) 0:06:17.532 **********
2026-04-20 00:34:27.769894 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:34:27.769900 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:34:27.769905 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:34:27.769911 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:34:27.769917 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:34:27.769923 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:34:27.769929 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:34:27.769935 | orchestrator |
2026-04-20 00:34:27.769942 | orchestrator | TASK [osism.services.docker : Block installation of python docker packages (install python bindings from pip)] ***
2026-04-20 00:34:27.769949 | orchestrator | Monday 20 April 2026 00:34:27 +0000 (0:00:00.410) 0:06:17.942 **********
2026-04-20 00:34:27.769955 | orchestrator | skipping: [testbed-manager] => (item=python3-docker)
2026-04-20 00:34:27.769961 | orchestrator | skipping: [testbed-manager] => (item=python-docker)
2026-04-20 00:34:27.769971 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:34:27.769977 | orchestrator | skipping: [testbed-node-0] => (item=python3-docker)
2026-04-20 00:34:27.769983 | orchestrator | skipping: [testbed-node-0] => (item=python-docker)
2026-04-20 00:34:27.769989 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:34:27.769995 | orchestrator | skipping: [testbed-node-1] => (item=python3-docker)
2026-04-20 00:34:27.770001 | orchestrator | skipping: [testbed-node-1] => (item=python-docker)
2026-04-20 00:34:27.770007 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:34:27.770068 | orchestrator | skipping: [testbed-node-2] => (item=python3-docker)
2026-04-20 00:34:45.366151 | orchestrator | skipping: [testbed-node-2] => (item=python-docker)
2026-04-20 00:34:45.366263 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:34:45.366279 | orchestrator | skipping: [testbed-node-3] => (item=python3-docker)
2026-04-20 00:34:45.366291 | orchestrator | skipping: [testbed-node-3] => (item=python-docker)
2026-04-20 00:34:45.366302 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:34:45.366313 | orchestrator | skipping: [testbed-node-4] => (item=python3-docker)
2026-04-20 00:34:45.366324 | orchestrator | skipping: [testbed-node-4] => (item=python-docker)
2026-04-20 00:34:45.366335 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:34:45.366346 | orchestrator | skipping: [testbed-node-5] => (item=python3-docker)
2026-04-20 00:34:45.366356 | orchestrator | skipping: [testbed-node-5] => (item=python-docker)
2026-04-20 00:34:45.366367 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:34:45.366378 | orchestrator |
2026-04-20 00:34:45.366390 | orchestrator | TASK [osism.services.docker : Install python3-pip package (install python bindings from pip)] ***
2026-04-20 00:34:45.366402 | orchestrator | Monday 20 April 2026 00:34:27 +0000 (0:00:00.436) 0:06:18.379 **********
2026-04-20 00:34:45.366412 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:34:45.366423 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:34:45.366434 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:34:45.366444 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:34:45.366455 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:34:45.366497 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:34:45.366508 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:34:45.366519 | orchestrator |
2026-04-20 00:34:45.366530 | orchestrator | TASK [osism.services.docker : Install docker packages (install python bindings from pip)] ***
2026-04-20 00:34:45.366541 | orchestrator | Monday 20 April 2026 00:34:28 +0000 (0:00:00.364) 0:06:18.744 **********
2026-04-20 00:34:45.366552 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:34:45.366563 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:34:45.366574 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:34:45.366587 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:34:45.366599 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:34:45.366611 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:34:45.366623 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:34:45.366635 | orchestrator |
2026-04-20 00:34:45.366647 | orchestrator | TASK [osism.services.docker : Install packages required by docker login] *******
2026-04-20 00:34:45.366659 | orchestrator | Monday 20 April 2026 00:34:28 +0000 (0:00:00.434) 0:06:19.178 **********
2026-04-20 00:34:45.366671 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:34:45.366683 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:34:45.366695 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:34:45.366706 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:34:45.366718 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:34:45.366730 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:34:45.366742 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:34:45.366754 | orchestrator |
2026-04-20 00:34:45.366766 | orchestrator | TASK [osism.services.docker : Ensure that some packages are not installed] *****
2026-04-20 00:34:45.366778 | orchestrator | Monday 20 April 2026 00:34:29 +0000 (0:00:00.420) 0:06:19.599 **********
2026-04-20 00:34:45.366824 | orchestrator | ok: [testbed-manager]
2026-04-20 00:34:45.366838 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:34:45.366850 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:34:45.366861 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:34:45.366872 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:34:45.366883 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:34:45.366894 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:34:45.366904 | orchestrator |
2026-04-20 00:34:45.366915 | orchestrator | TASK [osism.services.docker : Include config tasks] ****************************
2026-04-20 00:34:45.366926 | orchestrator | Monday 20 April 2026 00:34:30 +0000 (0:00:01.727) 0:06:21.326 **********
2026-04-20 00:34:45.366939 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/config.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:34:45.366952 | orchestrator |
2026-04-20 00:34:45.366963 | orchestrator | TASK [osism.services.docker : Create plugins directory] ************************
2026-04-20 00:34:45.366987 | orchestrator | Monday 20 April 2026 00:34:31 +0000 (0:00:00.787) 0:06:22.114 **********
2026-04-20 00:34:45.366999 | orchestrator | ok: [testbed-manager]
2026-04-20 00:34:45.367009 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:34:45.367020 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:34:45.367031 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:34:45.367042 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:34:45.367052 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:34:45.367063 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:34:45.367073 | orchestrator |
2026-04-20 00:34:45.367084 | orchestrator | TASK [osism.services.docker : Create systemd overlay directory] ****************
2026-04-20 00:34:45.367095 | orchestrator | Monday 20 April 2026 00:34:32 +0000 (0:00:00.908) 0:06:23.022 **********
2026-04-20 00:34:45.367119 | orchestrator | ok: [testbed-manager]
2026-04-20 00:34:45.367130 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:34:45.367141 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:34:45.367151 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:34:45.367162 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:34:45.367184 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:34:45.367195 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:34:45.367206 | orchestrator |
2026-04-20 00:34:45.367216 | orchestrator | TASK [osism.services.docker : Copy systemd overlay file] ***********************
2026-04-20 00:34:45.367227 | orchestrator | Monday 20 April 2026 00:34:33 +0000 (0:00:00.781) 0:06:23.804 **********
2026-04-20 00:34:45.367237 | orchestrator | ok: [testbed-manager]
2026-04-20 00:34:45.367248 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:34:45.367259 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:34:45.367269 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:34:45.367281 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:34:45.367291 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:34:45.367302 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:34:45.367312 | orchestrator |
2026-04-20 00:34:45.367323 | orchestrator | TASK [osism.services.docker : Reload systemd daemon if systemd overlay file is changed] ***
2026-04-20 00:34:45.367354 | orchestrator | Monday 20 April 2026 00:34:34 +0000 (0:00:01.219) 0:06:25.024 **********
2026-04-20 00:34:45.367366 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:34:45.367377 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:34:45.367387 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:34:45.367398 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:34:45.367408 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:34:45.367419 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:34:45.367429 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:34:45.367440 | orchestrator |
2026-04-20 00:34:45.367451 | orchestrator | TASK [osism.services.docker : Copy limits configuration file] ******************
2026-04-20 00:34:45.367506 | orchestrator | Monday 20 April 2026 00:34:35 +0000 (0:00:01.283) 0:06:26.308 **********
2026-04-20 00:34:45.367518 | orchestrator | ok: [testbed-manager]
2026-04-20 00:34:45.367529 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:34:45.367549 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:34:45.367560 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:34:45.367570 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:34:45.367581 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:34:45.367591 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:34:45.367602 | orchestrator |
2026-04-20 00:34:45.367613 | orchestrator | TASK [osism.services.docker : Copy daemon.json configuration file] *************
2026-04-20 00:34:45.367624 | orchestrator | Monday 20 April 2026 00:34:37 +0000 (0:00:01.179) 0:06:27.487 **********
2026-04-20 00:34:45.367634 | orchestrator | changed: [testbed-manager]
2026-04-20 00:34:45.367645 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:34:45.367656 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:34:45.367666 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:34:45.367677 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:34:45.367687 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:34:45.367698 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:34:45.367708 | orchestrator |
2026-04-20 00:34:45.367719 | orchestrator | TASK [osism.services.docker : Include service tasks] ***************************
2026-04-20 00:34:45.367730 | orchestrator | Monday 20 April 2026 00:34:38 +0000 (0:00:01.475) 0:06:28.963 **********
2026-04-20 00:34:45.367741 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/service.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:34:45.367752 | orchestrator |
2026-04-20 00:34:45.367763 | orchestrator | TASK [osism.services.docker : Reload systemd daemon] ***************************
2026-04-20 00:34:45.367774 | orchestrator | Monday 20 April 2026 00:34:39 +0000 (0:00:00.830) 0:06:29.793 **********
2026-04-20 00:34:45.367784 | orchestrator | ok: [testbed-manager]
2026-04-20 00:34:45.367795 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:34:45.367806 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:34:45.367816 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:34:45.367827 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:34:45.367837 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:34:45.367848 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:34:45.367858 | orchestrator |
2026-04-20 00:34:45.367869 | orchestrator | TASK [osism.services.docker : Manage service] **********************************
2026-04-20 00:34:45.367880 | orchestrator | Monday 20 April 2026 00:34:40 +0000 (0:00:01.303) 0:06:31.096 **********
2026-04-20 00:34:45.367890 | orchestrator | ok: [testbed-manager]
2026-04-20 00:34:45.367901 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:34:45.367911 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:34:45.367922 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:34:45.367933 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:34:45.367943 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:34:45.367954 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:34:45.367965 | orchestrator |
2026-04-20 00:34:45.367975 | orchestrator | TASK [osism.services.docker : Manage docker socket service] ********************
2026-04-20 00:34:45.367986 | orchestrator | Monday 20 April 2026 00:34:41 +0000 (0:00:01.285) 0:06:32.382 **********
2026-04-20 00:34:45.367997 | orchestrator | ok: [testbed-manager]
2026-04-20 00:34:45.368007 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:34:45.368018 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:34:45.368029 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:34:45.368039 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:34:45.368050 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:34:45.368060 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:34:45.368071 | orchestrator |
2026-04-20 00:34:45.368082 | orchestrator | TASK [osism.services.docker : Manage containerd service] ***********************
2026-04-20 00:34:45.368098 | orchestrator | Monday 20 April 2026 00:34:43 +0000 (0:00:01.139) 0:06:33.521 **********
2026-04-20 00:34:45.368109 | orchestrator | ok: [testbed-manager]
2026-04-20 00:34:45.368120 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:34:45.368130 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:34:45.368141 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:34:45.368158 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:34:45.368169 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:34:45.368179 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:34:45.368190 | orchestrator |
2026-04-20 00:34:45.368201 | orchestrator | TASK [osism.services.docker : Include bootstrap tasks] *************************
2026-04-20 00:34:45.368211 | orchestrator | Monday 20 April 2026 00:34:44 +0000 (0:00:01.104) 0:06:34.625 **********
2026-04-20 00:34:45.368222 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/bootstrap.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:34:45.368233 | orchestrator |
2026-04-20 00:34:45.368244 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2026-04-20 00:34:45.368255 | orchestrator | Monday 20 April 2026 00:34:45 +0000 (0:00:00.876) 0:06:35.502 **********
2026-04-20 00:34:45.368266 | orchestrator |
2026-04-20 00:34:45.368277 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2026-04-20 00:34:45.368287 | orchestrator | Monday 20 April 2026 00:34:45 +0000 (0:00:00.040) 0:06:35.542 **********
2026-04-20 00:34:45.368298 | orchestrator |
2026-04-20 00:34:45.368309 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2026-04-20 00:34:45.368319 | orchestrator | Monday 20 April 2026 00:34:45 +0000 (0:00:00.203) 0:06:35.745 **********
2026-04-20 00:34:45.368330 | orchestrator |
2026-04-20 00:34:45.368341 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2026-04-20 00:34:45.368358 | orchestrator | Monday 20 April 2026 00:34:45 +0000 (0:00:00.042) 0:06:35.787 **********
2026-04-20 00:35:11.283588 | orchestrator |
2026-04-20 00:35:11.283730 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2026-04-20 00:35:11.283758 | orchestrator | Monday 20 April 2026 00:34:45 +0000 (0:00:00.041) 0:06:35.829 **********
2026-04-20 00:35:11.283771 | orchestrator |
2026-04-20 00:35:11.283782 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2026-04-20 00:35:11.283794 | orchestrator | Monday 20 April 2026 00:34:45 +0000 (0:00:00.050) 0:06:35.880 **********
2026-04-20 00:35:11.283804 | orchestrator |
2026-04-20 00:35:11.283816 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2026-04-20 00:35:11.283826 | orchestrator | Monday 20 April 2026 00:34:45 +0000 (0:00:00.052) 0:06:35.933 **********
2026-04-20 00:35:11.283837 | orchestrator |
2026-04-20 00:35:11.283848 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] *****
2026-04-20 00:35:11.283859 | orchestrator | Monday 20 April 2026 00:34:45 +0000 (0:00:00.042) 0:06:35.975 **********
2026-04-20 00:35:11.283870 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:35:11.283882 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:35:11.283892 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:35:11.283903 | orchestrator |
2026-04-20 00:35:11.283914 | orchestrator | RUNNING HANDLER [osism.services.rsyslog : Restart rsyslog service] *************
2026-04-20 00:35:11.283925 | orchestrator | Monday 20 April 2026 00:34:46 +0000 (0:00:01.162) 0:06:37.137 **********
2026-04-20 00:35:11.283936 | orchestrator | changed: [testbed-manager]
2026-04-20 00:35:11.283947 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:35:11.283958 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:35:11.283969 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:35:11.283980 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:35:11.283990 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:35:11.284002 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:35:11.284015 | orchestrator |
2026-04-20 00:35:11.284027 | orchestrator | RUNNING HANDLER [osism.services.rsyslog : Restart logrotate service] ***********
2026-04-20 00:35:11.284040 | orchestrator | Monday 20 April 2026 00:34:47 +0000 (0:00:01.260) 0:06:38.397 **********
2026-04-20 00:35:11.284052 | orchestrator | changed: [testbed-manager]
2026-04-20 00:35:11.284065 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:35:11.284077 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:35:11.284089 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:35:11.284129 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:35:11.284141 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:35:11.284154 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:35:11.284166 | orchestrator |
2026-04-20 00:35:11.284178 | orchestrator | RUNNING HANDLER [osism.services.docker : Restart docker service] ***************
2026-04-20 00:35:11.284191 | orchestrator | Monday 20 April 2026 00:34:49 +0000 (0:00:01.076) 0:06:39.474 **********
2026-04-20 00:35:11.284203 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:35:11.284215 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:35:11.284227 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:35:11.284240 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:35:11.284252 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:35:11.284265 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:35:11.284277 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:35:11.284290 | orchestrator |
2026-04-20 00:35:11.284302 | orchestrator | RUNNING HANDLER [osism.services.docker : Wait after docker service restart] ****
2026-04-20 00:35:11.284314 | orchestrator | Monday 20 April 2026 00:34:51 +0000 (0:00:02.353) 0:06:41.828 **********
2026-04-20 00:35:11.284327 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:35:11.284339 | orchestrator |
2026-04-20 00:35:11.284351 | orchestrator | TASK [osism.services.docker : Add user to docker group] ************************
2026-04-20 00:35:11.284363 | orchestrator | Monday 20 April 2026 00:34:51 +0000 (0:00:00.091) 0:06:41.919 **********
2026-04-20 00:35:11.284373 | orchestrator | ok: [testbed-manager]
2026-04-20 00:35:11.284384 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:35:11.284394 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:35:11.284405 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:35:11.284435 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:35:11.284447 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:35:11.284457 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:35:11.284468 | orchestrator |
2026-04-20 00:35:11.284479 | orchestrator | TASK [osism.services.docker : Log into private registry and force re-authorization] ***
2026-04-20 00:35:11.284492 | orchestrator | Monday 20 April 2026 00:34:52 +0000 (0:00:01.144) 0:06:43.063 **********
2026-04-20 00:35:11.284502 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:35:11.284513 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:35:11.284524 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:35:11.284534 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:35:11.284545 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:35:11.284556 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:35:11.284566 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:35:11.284577 | orchestrator |
2026-04-20 00:35:11.284587 | orchestrator | TASK [osism.services.docker : Include facts tasks] *****************************
2026-04-20 00:35:11.284598 | orchestrator | Monday 20 April 2026 00:34:53 +0000 (0:00:00.512) 0:06:43.576 **********
2026-04-20 00:35:11.284610 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/facts.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:35:11.284624 | orchestrator |
2026-04-20 00:35:11.284635 | orchestrator | TASK [osism.services.docker : Create facts directory] **************************
2026-04-20 00:35:11.284646 | orchestrator | Monday 20 April 2026 00:34:53 +0000 (0:00:00.833) 0:06:44.410 **********
2026-04-20 00:35:11.284657 | orchestrator | ok: [testbed-manager]
2026-04-20 00:35:11.284667 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:35:11.284678 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:35:11.284689 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:35:11.284700 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:35:11.284710 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:35:11.284721 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:35:11.284732 | orchestrator |
2026-04-20 00:35:11.284742 | orchestrator | TASK [osism.services.docker : Copy docker fact files] **************************
2026-04-20 00:35:11.284753 | orchestrator | Monday 20 April 2026 00:34:54 +0000 (0:00:01.009) 0:06:45.419 **********
2026-04-20 00:35:11.284773 | orchestrator | ok: [testbed-manager] => (item=docker_containers)
2026-04-20 00:35:11.284803 | orchestrator | changed: [testbed-node-0] => (item=docker_containers)
2026-04-20 00:35:11.284815 | orchestrator | changed: [testbed-node-1] => (item=docker_containers)
2026-04-20 00:35:11.284826 | orchestrator | changed: [testbed-node-2] => (item=docker_containers)
2026-04-20 00:35:11.284837 | orchestrator | changed: [testbed-node-3] => (item=docker_containers)
2026-04-20 00:35:11.284848 | orchestrator | changed: [testbed-node-4] => (item=docker_containers)
2026-04-20 00:35:11.284858 | orchestrator | changed: [testbed-node-5] => (item=docker_containers)
2026-04-20 00:35:11.284886 | orchestrator | ok: [testbed-manager] => (item=docker_images)
2026-04-20 00:35:11.284897 | orchestrator | changed: [testbed-node-0] => (item=docker_images)
2026-04-20 00:35:11.284908 | orchestrator | changed: [testbed-node-2] => (item=docker_images)
2026-04-20 00:35:11.284919 | orchestrator | changed: [testbed-node-1] => (item=docker_images)
2026-04-20 00:35:11.284930 | orchestrator | changed: [testbed-node-3] => (item=docker_images)
2026-04-20 00:35:11.284940 | orchestrator | changed: [testbed-node-4] => (item=docker_images)
2026-04-20 00:35:11.284951 | orchestrator | changed: [testbed-node-5] => (item=docker_images)
2026-04-20 00:35:11.284962 | orchestrator |
2026-04-20 00:35:11.284972 | orchestrator | TASK [osism.commons.docker_compose : This install type is not supported] *******
2026-04-20 00:35:11.284983 | orchestrator | Monday 20 April 2026 00:34:57 +0000 (0:00:02.475) 0:06:47.895 **********
2026-04-20 00:35:11.284994 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:35:11.285004 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:35:11.285015 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:35:11.285026 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:35:11.285036 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:35:11.285047 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:35:11.285057 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:35:11.285068 | orchestrator |
2026-04-20 00:35:11.285079 | orchestrator | TASK [osism.commons.docker_compose : Include distribution specific install tasks] ***
2026-04-20 00:35:11.285090 | orchestrator | Monday 20 April 2026 00:34:57 +0000 (0:00:00.506) 0:06:48.402 **********
2026-04-20 00:35:11.285102 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/docker_compose/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:35:11.285114 | orchestrator |
2026-04-20 00:35:11.285125 | orchestrator | TASK [osism.commons.docker_compose : Remove docker-compose apt preferences file] ***
2026-04-20 00:35:11.285136 | orchestrator | Monday 20 April 2026 00:34:58 +0000 (0:00:00.931) 0:06:49.333 **********
2026-04-20 00:35:11.285147 | orchestrator | ok: [testbed-manager]
2026-04-20 00:35:11.285157 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:35:11.285168 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:35:11.285178 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:35:11.285189 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:35:11.285199 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:35:11.285210 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:35:11.285221 | orchestrator |
2026-04-20 00:35:11.285231 | orchestrator | TASK [osism.commons.docker_compose : Get checksum of docker-compose file] ******
2026-04-20 00:35:11.285242 | orchestrator | Monday 20 April 2026 00:34:59 +0000 (0:00:00.839) 0:06:50.173 **********
2026-04-20 00:35:11.285253 | orchestrator | ok: [testbed-manager]
2026-04-20 00:35:11.285264 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:35:11.285274 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:35:11.285285 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:35:11.285295 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:35:11.285306 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:35:11.285316 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:35:11.285327 | orchestrator |
2026-04-20 00:35:11.285338 | orchestrator | TASK [osism.commons.docker_compose : Remove docker-compose binary] *************
2026-04-20 00:35:11.285356 | orchestrator | Monday 20 April 2026 00:35:00 +0000 (0:00:00.781) 0:06:50.954 **********
2026-04-20 00:35:11.285367 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:35:11.285377 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:35:11.285388 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:35:11.285399 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:35:11.285430 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:35:11.285442 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:35:11.285453 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:35:11.285463 | orchestrator |
2026-04-20 00:35:11.285474 | orchestrator | TASK [osism.commons.docker_compose : Uninstall docker-compose package] *********
2026-04-20 00:35:11.285485 | orchestrator | Monday 20 April 2026 00:35:00 +0000 (0:00:00.464) 0:06:51.419 **********
2026-04-20 00:35:11.285496 | orchestrator | ok: [testbed-manager]
2026-04-20 00:35:11.285507 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:35:11.285518 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:35:11.285528 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:35:11.285539 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:35:11.285549 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:35:11.285560 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:35:11.285570 | orchestrator |
2026-04-20 00:35:11.285581 | orchestrator | TASK [osism.commons.docker_compose : Copy docker-compose script] ***************
2026-04-20 00:35:11.285592 | orchestrator | Monday 20 April 2026 00:35:02 +0000 (0:00:01.714) 0:06:53.133 **********
2026-04-20 00:35:11.285602 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:35:11.285613 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:35:11.285624 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:35:11.285635 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:35:11.285646 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:35:11.285656 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:35:11.285667 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:35:11.285678 | orchestrator |
2026-04-20 00:35:11.285688 | orchestrator | TASK [osism.commons.docker_compose : Install docker-compose-plugin package] ****
2026-04-20 00:35:11.285699 | orchestrator | Monday 20 April 2026 00:35:03 +0000 (0:00:00.616) 0:06:53.749 **********
2026-04-20 00:35:11.285710 | orchestrator | ok: [testbed-manager]
2026-04-20 00:35:11.285720 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:35:11.285731 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:35:11.285742 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:35:11.285752 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:35:11.285763 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:35:11.285780 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:35:43.102556 | orchestrator |
2026-04-20 00:35:43.102667 | orchestrator | TASK [osism.commons.docker_compose : Copy osism.target systemd file] ***********
2026-04-20 00:35:43.102683 | orchestrator | Monday 20 April 2026 00:35:11 +0000 (0:00:08.018) 0:07:01.768 **********
2026-04-20 00:35:43.102693 | orchestrator | ok: [testbed-manager]
2026-04-20 00:35:43.102703 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:35:43.102714 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:35:43.102723 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:35:43.102731 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:35:43.102740 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:35:43.102749 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:35:43.102757 | orchestrator |
2026-04-20 00:35:43.102766 | orchestrator | TASK [osism.commons.docker_compose : Enable osism.target] **********************
2026-04-20 00:35:43.102775 | orchestrator | Monday 20 April 2026 00:35:12 +0000 (0:00:01.344) 0:07:03.113 **********
2026-04-20 00:35:43.102784 | orchestrator | ok: [testbed-manager]
2026-04-20 00:35:43.102793 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:35:43.102802 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:35:43.102810 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:35:43.102818 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:35:43.102827 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:35:43.102835 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:35:43.102868 | orchestrator |
2026-04-20 00:35:43.102877 | orchestrator | TASK [osism.commons.docker_compose : Copy docker-compose systemd unit file] ****
2026-04-20 00:35:43.102886 | orchestrator | Monday 20 April 2026 00:35:14 +0000 (0:00:01.657) 0:07:04.770 **********
2026-04-20 00:35:43.102895 | orchestrator | ok: [testbed-manager]
2026-04-20 00:35:43.102904 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:35:43.102912 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:35:43.102921 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:35:43.102930 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:35:43.102938 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:35:43.102946 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:35:43.102955 | orchestrator |
2026-04-20 00:35:43.102963 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] *********************
2026-04-20 00:35:43.102972 | orchestrator | Monday 20 April 2026 00:35:16 +0000 (0:00:00.840) 0:07:06.556 **********
2026-04-20 00:35:43.102981 | orchestrator | ok: [testbed-manager]
2026-04-20 00:35:43.102989 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:35:43.102998 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:35:43.103006 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:35:43.103015 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:35:43.103023 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:35:43.103032 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:35:43.103040 | orchestrator |
2026-04-20 00:35:43.103049 | orchestrator | TASK [osism.commons.facts : Copy fact files] ***********************************
2026-04-20 00:35:43.103059 | orchestrator | Monday 20 April 2026 00:35:16 +0000 (0:00:00.840) 0:07:07.396 **********
2026-04-20 00:35:43.103069 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:35:43.103078 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:35:43.103088 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:35:43.103098 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:35:43.103107 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:35:43.103117 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:35:43.103126 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:35:43.103136 | orchestrator |
2026-04-20 00:35:43.103146 | orchestrator | TASK [osism.services.chrony : Check minimum and maximum number of servers] *****
2026-04-20 00:35:43.103156 | orchestrator | Monday 20 April 2026 00:35:17 +0000 (0:00:00.781) 0:07:08.178 **********
2026-04-20 00:35:43.103166 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:35:43.103175 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:35:43.103186 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:35:43.103195 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:35:43.103205 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:35:43.103215 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:35:43.103224 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:35:43.103234 | orchestrator | 2026-04-20 00:35:43.103244 | orchestrator | TASK [osism.services.chrony : Gather variables for each operating system] ****** 2026-04-20 00:35:43.103254 | orchestrator | Monday 20 April 2026 00:35:18 +0000 (0:00:00.645) 0:07:08.823 ********** 2026-04-20 00:35:43.103264 | orchestrator | ok: [testbed-manager] 2026-04-20 00:35:43.103273 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:35:43.103283 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:35:43.103309 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:35:43.103320 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:35:43.103329 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:35:43.103340 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:35:43.103349 | orchestrator | 2026-04-20 00:35:43.103414 | orchestrator | TASK [osism.services.chrony : Set chrony_conf_file variable to default value] *** 2026-04-20 00:35:43.103427 | orchestrator | Monday 20 April 2026 00:35:18 +0000 (0:00:00.499) 0:07:09.322 ********** 2026-04-20 00:35:43.103437 | orchestrator | ok: [testbed-manager] 2026-04-20 00:35:43.103447 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:35:43.103455 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:35:43.103464 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:35:43.103472 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:35:43.103481 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:35:43.103498 | orchestrator | ok: [testbed-node-5] 2026-04-20 
00:35:43.103507 | orchestrator | 2026-04-20 00:35:43.103516 | orchestrator | TASK [osism.services.chrony : Set chrony_key_file variable to default value] *** 2026-04-20 00:35:43.103524 | orchestrator | Monday 20 April 2026 00:35:19 +0000 (0:00:00.484) 0:07:09.806 ********** 2026-04-20 00:35:43.103533 | orchestrator | ok: [testbed-manager] 2026-04-20 00:35:43.103541 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:35:43.103550 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:35:43.103558 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:35:43.103567 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:35:43.103575 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:35:43.103583 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:35:43.103592 | orchestrator | 2026-04-20 00:35:43.103600 | orchestrator | TASK [osism.services.chrony : Populate service facts] ************************** 2026-04-20 00:35:43.103609 | orchestrator | Monday 20 April 2026 00:35:19 +0000 (0:00:00.487) 0:07:10.294 ********** 2026-04-20 00:35:43.103618 | orchestrator | ok: [testbed-manager] 2026-04-20 00:35:43.103626 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:35:43.103635 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:35:43.103643 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:35:43.103651 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:35:43.103660 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:35:43.103668 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:35:43.103677 | orchestrator | 2026-04-20 00:35:43.103713 | orchestrator | TASK [osism.services.chrony : Manage timesyncd service] ************************ 2026-04-20 00:35:43.103723 | orchestrator | Monday 20 April 2026 00:35:25 +0000 (0:00:05.521) 0:07:15.816 ********** 2026-04-20 00:35:43.103732 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:35:43.103741 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:35:43.103749 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:35:43.103758 
| orchestrator | skipping: [testbed-node-2] 2026-04-20 00:35:43.103766 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:35:43.103775 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:35:43.103783 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:35:43.103792 | orchestrator | 2026-04-20 00:35:43.103801 | orchestrator | TASK [osism.services.chrony : Include distribution specific install tasks] ***** 2026-04-20 00:35:43.103809 | orchestrator | Monday 20 April 2026 00:35:25 +0000 (0:00:00.544) 0:07:16.360 ********** 2026-04-20 00:35:43.103819 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:35:43.103831 | orchestrator | 2026-04-20 00:35:43.103840 | orchestrator | TASK [osism.services.chrony : Install package] ********************************* 2026-04-20 00:35:43.103848 | orchestrator | Monday 20 April 2026 00:35:26 +0000 (0:00:00.683) 0:07:17.044 ********** 2026-04-20 00:35:43.103857 | orchestrator | ok: [testbed-manager] 2026-04-20 00:35:43.103865 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:35:43.103874 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:35:43.103882 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:35:43.103891 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:35:43.103899 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:35:43.103907 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:35:43.103916 | orchestrator | 2026-04-20 00:35:43.103924 | orchestrator | TASK [osism.services.chrony : Manage chrony service] *************************** 2026-04-20 00:35:43.103933 | orchestrator | Monday 20 April 2026 00:35:28 +0000 (0:00:01.762) 0:07:18.806 ********** 2026-04-20 00:35:43.103941 | orchestrator | ok: [testbed-manager] 2026-04-20 00:35:43.103949 | orchestrator | ok: [testbed-node-0] 2026-04-20 
00:35:43.103958 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:35:43.103966 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:35:43.103974 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:35:43.103982 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:35:43.103990 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:35:43.103999 | orchestrator | 2026-04-20 00:35:43.104008 | orchestrator | TASK [osism.services.chrony : Check if configuration file exists] ************** 2026-04-20 00:35:43.104023 | orchestrator | Monday 20 April 2026 00:35:29 +0000 (0:00:01.116) 0:07:19.923 ********** 2026-04-20 00:35:43.104044 | orchestrator | ok: [testbed-manager] 2026-04-20 00:35:43.104053 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:35:43.104070 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:35:43.104079 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:35:43.104087 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:35:43.104095 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:35:43.104104 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:35:43.104112 | orchestrator | 2026-04-20 00:35:43.104120 | orchestrator | TASK [osism.services.chrony : Copy configuration file] ************************* 2026-04-20 00:35:43.104129 | orchestrator | Monday 20 April 2026 00:35:30 +0000 (0:00:00.796) 0:07:20.720 ********** 2026-04-20 00:35:43.104138 | orchestrator | changed: [testbed-manager] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2026-04-20 00:35:43.104148 | orchestrator | changed: [testbed-node-0] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2026-04-20 00:35:43.104156 | orchestrator | changed: [testbed-node-1] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2026-04-20 00:35:43.104165 | orchestrator | changed: [testbed-node-2] => 
(item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2026-04-20 00:35:43.104179 | orchestrator | changed: [testbed-node-3] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2026-04-20 00:35:43.104188 | orchestrator | changed: [testbed-node-4] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2026-04-20 00:35:43.104197 | orchestrator | changed: [testbed-node-5] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2026-04-20 00:35:43.104205 | orchestrator | 2026-04-20 00:35:43.104214 | orchestrator | TASK [osism.services.lldpd : Include distribution specific install tasks] ****** 2026-04-20 00:35:43.104222 | orchestrator | Monday 20 April 2026 00:35:31 +0000 (0:00:01.585) 0:07:22.305 ********** 2026-04-20 00:35:43.104231 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/lldpd/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:35:43.104240 | orchestrator | 2026-04-20 00:35:43.104249 | orchestrator | TASK [osism.services.lldpd : Install lldpd package] **************************** 2026-04-20 00:35:43.104257 | orchestrator | Monday 20 April 2026 00:35:32 +0000 (0:00:00.910) 0:07:23.215 ********** 2026-04-20 00:35:43.104266 | orchestrator | changed: [testbed-manager] 2026-04-20 00:35:43.104275 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:35:43.104289 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:35:43.104304 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:35:43.104317 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:35:43.104330 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:35:43.104343 | orchestrator | changed: 
[testbed-node-3] 2026-04-20 00:35:43.104357 | orchestrator | 2026-04-20 00:35:43.104403 | orchestrator | TASK [osism.services.lldpd : Manage lldpd service] ***************************** 2026-04-20 00:36:12.427153 | orchestrator | Monday 20 April 2026 00:35:43 +0000 (0:00:10.309) 0:07:33.525 ********** 2026-04-20 00:36:12.427278 | orchestrator | ok: [testbed-manager] 2026-04-20 00:36:12.427306 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:36:12.427387 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:36:12.427398 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:36:12.427409 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:36:12.427419 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:36:12.427430 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:36:12.427441 | orchestrator | 2026-04-20 00:36:12.427452 | orchestrator | RUNNING HANDLER [osism.commons.docker_compose : Reload systemd daemon] ********* 2026-04-20 00:36:12.427502 | orchestrator | Monday 20 April 2026 00:35:44 +0000 (0:00:01.779) 0:07:35.304 ********** 2026-04-20 00:36:12.427522 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:36:12.427540 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:36:12.427560 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:36:12.427580 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:36:12.427601 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:36:12.427621 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:36:12.427637 | orchestrator | 2026-04-20 00:36:12.427648 | orchestrator | RUNNING HANDLER [osism.services.chrony : Restart chrony service] *************** 2026-04-20 00:36:12.427658 | orchestrator | Monday 20 April 2026 00:35:46 +0000 (0:00:01.582) 0:07:36.887 ********** 2026-04-20 00:36:12.427669 | orchestrator | changed: [testbed-manager] 2026-04-20 00:36:12.427681 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:36:12.427694 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:36:12.427707 | orchestrator | changed: 
[testbed-node-2] 2026-04-20 00:36:12.427718 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:36:12.427731 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:36:12.427743 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:36:12.427755 | orchestrator | 2026-04-20 00:36:12.427767 | orchestrator | PLAY [Apply bootstrap role part 2] ********************************************* 2026-04-20 00:36:12.427779 | orchestrator | 2026-04-20 00:36:12.427798 | orchestrator | TASK [Include hardening role] ************************************************** 2026-04-20 00:36:12.427818 | orchestrator | Monday 20 April 2026 00:35:47 +0000 (0:00:01.229) 0:07:38.116 ********** 2026-04-20 00:36:12.427837 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:36:12.427854 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:36:12.427873 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:36:12.427893 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:36:12.427914 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:36:12.427933 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:36:12.427951 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:36:12.427963 | orchestrator | 2026-04-20 00:36:12.427976 | orchestrator | PLAY [Apply bootstrap roles part 3] ******************************************** 2026-04-20 00:36:12.427989 | orchestrator | 2026-04-20 00:36:12.428001 | orchestrator | TASK [osism.services.journald : Copy configuration file] *********************** 2026-04-20 00:36:12.428013 | orchestrator | Monday 20 April 2026 00:35:48 +0000 (0:00:00.503) 0:07:38.620 ********** 2026-04-20 00:36:12.428025 | orchestrator | changed: [testbed-manager] 2026-04-20 00:36:12.428037 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:36:12.428050 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:36:12.428062 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:36:12.428073 | orchestrator | changed: [testbed-node-3] 2026-04-20 
00:36:12.428084 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:36:12.428094 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:36:12.428105 | orchestrator | 2026-04-20 00:36:12.428116 | orchestrator | TASK [osism.services.journald : Manage journald service] *********************** 2026-04-20 00:36:12.428126 | orchestrator | Monday 20 April 2026 00:35:49 +0000 (0:00:01.289) 0:07:39.909 ********** 2026-04-20 00:36:12.428137 | orchestrator | ok: [testbed-manager] 2026-04-20 00:36:12.428147 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:36:12.428158 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:36:12.428169 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:36:12.428188 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:36:12.428206 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:36:12.428225 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:36:12.428245 | orchestrator | 2026-04-20 00:36:12.428264 | orchestrator | TASK [Include auditd role] ***************************************************** 2026-04-20 00:36:12.428282 | orchestrator | Monday 20 April 2026 00:35:51 +0000 (0:00:01.570) 0:07:41.480 ********** 2026-04-20 00:36:12.428293 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:36:12.428303 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:36:12.428352 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:36:12.428390 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:36:12.428402 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:36:12.428413 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:36:12.428424 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:36:12.428435 | orchestrator | 2026-04-20 00:36:12.428446 | orchestrator | TASK [Include smartd role] ***************************************************** 2026-04-20 00:36:12.428457 | orchestrator | Monday 20 April 2026 00:35:51 +0000 (0:00:00.460) 0:07:41.941 ********** 2026-04-20 00:36:12.428468 | orchestrator | included: 
osism.services.smartd for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:36:12.428481 | orchestrator | 2026-04-20 00:36:12.428492 | orchestrator | TASK [osism.services.smartd : Include distribution specific install tasks] ***** 2026-04-20 00:36:12.428503 | orchestrator | Monday 20 April 2026 00:35:52 +0000 (0:00:00.794) 0:07:42.735 ********** 2026-04-20 00:36:12.428515 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/smartd/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:36:12.428534 | orchestrator | 2026-04-20 00:36:12.428553 | orchestrator | TASK [osism.services.smartd : Install smartmontools package] ******************* 2026-04-20 00:36:12.428571 | orchestrator | Monday 20 April 2026 00:35:53 +0000 (0:00:00.942) 0:07:43.678 ********** 2026-04-20 00:36:12.428590 | orchestrator | changed: [testbed-manager] 2026-04-20 00:36:12.428610 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:36:12.428630 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:36:12.428649 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:36:12.428668 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:36:12.428679 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:36:12.428689 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:36:12.428700 | orchestrator | 2026-04-20 00:36:12.428733 | orchestrator | TASK [osism.services.smartd : Create /var/log/smartd directory] **************** 2026-04-20 00:36:12.428753 | orchestrator | Monday 20 April 2026 00:36:01 +0000 (0:00:08.564) 0:07:52.242 ********** 2026-04-20 00:36:12.428772 | orchestrator | changed: [testbed-manager] 2026-04-20 00:36:12.428790 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:36:12.428809 | orchestrator | changed: [testbed-node-1] 2026-04-20 
00:36:12.428828 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:36:12.428847 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:36:12.428865 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:36:12.428880 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:36:12.428891 | orchestrator | 2026-04-20 00:36:12.428901 | orchestrator | TASK [osism.services.smartd : Copy smartmontools configuration file] *********** 2026-04-20 00:36:12.428912 | orchestrator | Monday 20 April 2026 00:36:02 +0000 (0:00:00.777) 0:07:53.020 ********** 2026-04-20 00:36:12.428922 | orchestrator | changed: [testbed-manager] 2026-04-20 00:36:12.428933 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:36:12.428943 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:36:12.428954 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:36:12.428964 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:36:12.428975 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:36:12.428985 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:36:12.429001 | orchestrator | 2026-04-20 00:36:12.429053 | orchestrator | TASK [osism.services.smartd : Manage smartd service] *************************** 2026-04-20 00:36:12.429073 | orchestrator | Monday 20 April 2026 00:36:03 +0000 (0:00:01.231) 0:07:54.251 ********** 2026-04-20 00:36:12.429091 | orchestrator | changed: [testbed-manager] 2026-04-20 00:36:12.429106 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:36:12.429117 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:36:12.429128 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:36:12.429138 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:36:12.429149 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:36:12.429159 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:36:12.429180 | orchestrator | 2026-04-20 00:36:12.429191 | orchestrator | RUNNING HANDLER [osism.services.journald : Restart journald service] *********** 
2026-04-20 00:36:12.429202 | orchestrator | Monday 20 April 2026 00:36:05 +0000 (0:00:01.747) 0:07:55.999 ********** 2026-04-20 00:36:12.429219 | orchestrator | changed: [testbed-manager] 2026-04-20 00:36:12.429238 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:36:12.429257 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:36:12.429277 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:36:12.429296 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:36:12.429338 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:36:12.429357 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:36:12.429377 | orchestrator | 2026-04-20 00:36:12.429396 | orchestrator | RUNNING HANDLER [osism.services.smartd : Restart smartd service] *************** 2026-04-20 00:36:12.429414 | orchestrator | Monday 20 April 2026 00:36:06 +0000 (0:00:01.299) 0:07:57.298 ********** 2026-04-20 00:36:12.429426 | orchestrator | changed: [testbed-manager] 2026-04-20 00:36:12.429436 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:36:12.429447 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:36:12.429457 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:36:12.429468 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:36:12.429478 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:36:12.429489 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:36:12.429499 | orchestrator | 2026-04-20 00:36:12.429510 | orchestrator | PLAY [Set state bootstrap] ***************************************************** 2026-04-20 00:36:12.429520 | orchestrator | 2026-04-20 00:36:12.429531 | orchestrator | TASK [Set osism.bootstrap.status fact] ***************************************** 2026-04-20 00:36:12.429542 | orchestrator | Monday 20 April 2026 00:36:07 +0000 (0:00:01.015) 0:07:58.314 ********** 2026-04-20 00:36:12.429570 | orchestrator | included: osism.commons.state for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, 
testbed-node-4, testbed-node-5 2026-04-20 00:36:12.429581 | orchestrator | 2026-04-20 00:36:12.429592 | orchestrator | TASK [osism.commons.state : Create custom facts directory] ********************* 2026-04-20 00:36:12.429602 | orchestrator | Monday 20 April 2026 00:36:08 +0000 (0:00:00.797) 0:07:59.111 ********** 2026-04-20 00:36:12.429613 | orchestrator | ok: [testbed-manager] 2026-04-20 00:36:12.429624 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:36:12.429634 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:36:12.429653 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:36:12.429664 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:36:12.429675 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:36:12.429685 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:36:12.429696 | orchestrator | 2026-04-20 00:36:12.429707 | orchestrator | TASK [osism.commons.state : Write state into file] ***************************** 2026-04-20 00:36:12.429718 | orchestrator | Monday 20 April 2026 00:36:09 +0000 (0:00:00.817) 0:07:59.929 ********** 2026-04-20 00:36:12.429729 | orchestrator | changed: [testbed-manager] 2026-04-20 00:36:12.429739 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:36:12.429750 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:36:12.429761 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:36:12.429771 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:36:12.429782 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:36:12.429792 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:36:12.429803 | orchestrator | 2026-04-20 00:36:12.429814 | orchestrator | TASK [Set osism.bootstrap.timestamp fact] ************************************** 2026-04-20 00:36:12.429825 | orchestrator | Monday 20 April 2026 00:36:10 +0000 (0:00:01.262) 0:08:01.191 ********** 2026-04-20 00:36:12.429835 | orchestrator | included: osism.commons.state for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, 
testbed-node-4, testbed-node-5 2026-04-20 00:36:12.429846 | orchestrator | 2026-04-20 00:36:12.429857 | orchestrator | TASK [osism.commons.state : Create custom facts directory] ********************* 2026-04-20 00:36:12.429868 | orchestrator | Monday 20 April 2026 00:36:11 +0000 (0:00:00.809) 0:08:02.001 ********** 2026-04-20 00:36:12.429887 | orchestrator | ok: [testbed-manager] 2026-04-20 00:36:12.429897 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:36:12.429908 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:36:12.429919 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:36:12.429929 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:36:12.429940 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:36:12.429950 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:36:12.429961 | orchestrator | 2026-04-20 00:36:12.429988 | orchestrator | TASK [osism.commons.state : Write state into file] ***************************** 2026-04-20 00:36:13.904034 | orchestrator | Monday 20 April 2026 00:36:12 +0000 (0:00:00.845) 0:08:02.846 ********** 2026-04-20 00:36:13.904138 | orchestrator | changed: [testbed-manager] 2026-04-20 00:36:13.904148 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:36:13.904153 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:36:13.904158 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:36:13.904163 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:36:13.904167 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:36:13.904172 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:36:13.904177 | orchestrator | 2026-04-20 00:36:13.904182 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:36:13.904190 | orchestrator | testbed-manager : ok=168  changed=40  unreachable=0 failed=0 skipped=42  rescued=0 ignored=0 2026-04-20 00:36:13.904199 | orchestrator | testbed-node-0 : ok=177  changed=69  unreachable=0 failed=0 skipped=37  rescued=0 ignored=0 
2026-04-20 00:36:13.904206 | orchestrator | testbed-node-1 : ok=177  changed=69  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2026-04-20 00:36:13.904232 | orchestrator | testbed-node-2 : ok=177  changed=69  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2026-04-20 00:36:13.905149 | orchestrator | testbed-node-3 : ok=175  changed=65  unreachable=0 failed=0 skipped=37  rescued=0 ignored=0 2026-04-20 00:36:13.905196 | orchestrator | testbed-node-4 : ok=175  changed=65  unreachable=0 failed=0 skipped=37  rescued=0 ignored=0 2026-04-20 00:36:13.905205 | orchestrator | testbed-node-5 : ok=175  changed=65  unreachable=0 failed=0 skipped=37  rescued=0 ignored=0 2026-04-20 00:36:13.905213 | orchestrator | 2026-04-20 00:36:13.905220 | orchestrator | 2026-04-20 00:36:13.905227 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 00:36:13.905236 | orchestrator | Monday 20 April 2026 00:36:13 +0000 (0:00:01.219) 0:08:04.066 ********** 2026-04-20 00:36:13.905242 | orchestrator | =============================================================================== 2026-04-20 00:36:13.905249 | orchestrator | osism.commons.packages : Install required packages --------------------- 76.07s 2026-04-20 00:36:13.905256 | orchestrator | osism.commons.packages : Download required packages -------------------- 38.07s 2026-04-20 00:36:13.905263 | orchestrator | osism.commons.cleanup : Cleanup installed packages --------------------- 34.66s 2026-04-20 00:36:13.905271 | orchestrator | osism.commons.repository : Update package cache ------------------------ 17.88s 2026-04-20 00:36:13.905276 | orchestrator | osism.services.docker : Install docker package ------------------------- 10.72s 2026-04-20 00:36:13.905280 | orchestrator | osism.commons.systohc : Install util-linux-extra package --------------- 10.65s 2026-04-20 00:36:13.905284 | orchestrator | osism.commons.packages : Remove dependencies that are no longer required -- 
10.36s 2026-04-20 00:36:13.905289 | orchestrator | osism.services.lldpd : Install lldpd package --------------------------- 10.31s 2026-04-20 00:36:13.905292 | orchestrator | osism.services.docker : Install containerd package ---------------------- 9.84s 2026-04-20 00:36:13.905296 | orchestrator | osism.services.docker : Install docker-cli package ---------------------- 9.51s 2026-04-20 00:36:13.905347 | orchestrator | osism.services.docker : Add repository ---------------------------------- 9.25s 2026-04-20 00:36:13.905352 | orchestrator | osism.services.rng : Install rng package -------------------------------- 9.22s 2026-04-20 00:36:13.905356 | orchestrator | osism.services.smartd : Install smartmontools package ------------------- 8.56s 2026-04-20 00:36:13.905360 | orchestrator | osism.commons.cleanup : Remove cloudinit package ------------------------ 8.52s 2026-04-20 00:36:13.905364 | orchestrator | osism.commons.cleanup : Uninstall unattended-upgrades package ----------- 8.40s 2026-04-20 00:36:13.905368 | orchestrator | osism.commons.docker_compose : Install docker-compose-plugin package ---- 8.02s 2026-04-20 00:36:13.905372 | orchestrator | osism.services.docker : Install apt-transport-https package ------------- 7.61s 2026-04-20 00:36:13.905387 | orchestrator | osism.commons.cleanup : Remove dependencies that are no longer required --- 6.01s 2026-04-20 00:36:13.905391 | orchestrator | osism.services.chrony : Populate service facts -------------------------- 5.52s 2026-04-20 00:36:13.905395 | orchestrator | osism.commons.services : Populate service facts ------------------------- 5.36s 2026-04-20 00:36:14.078009 | orchestrator | + osism apply fail2ban 2026-04-20 00:36:25.752202 | orchestrator | 2026-04-20 00:36:25 | INFO  | Prepare task for execution of fail2ban. 2026-04-20 00:36:25.832869 | orchestrator | 2026-04-20 00:36:25 | INFO  | Task b91ce92b-a066-42c5-a370-6d540c043e0b (fail2ban) was prepared for execution. 
2026-04-20 00:36:25.832993 | orchestrator | 2026-04-20 00:36:25 | INFO  | It takes a moment until task b91ce92b-a066-42c5-a370-6d540c043e0b (fail2ban) has been started and output is visible here.
2026-04-20 00:36:46.467949 | orchestrator | 
2026-04-20 00:36:46.468091 | orchestrator | PLAY [Apply role fail2ban] *****************************************************
2026-04-20 00:36:46.468119 | orchestrator | 
2026-04-20 00:36:46.468135 | orchestrator | TASK [osism.services.fail2ban : Include distribution specific install tasks] ***
2026-04-20 00:36:46.468147 | orchestrator | Monday 20 April 2026 00:36:29 +0000 (0:00:00.320) 0:00:00.320 **********
2026-04-20 00:36:46.468159 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/fail2ban/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:36:46.468174 | orchestrator | 
2026-04-20 00:36:46.468193 | orchestrator | TASK [osism.services.fail2ban : Install fail2ban package] **********************
2026-04-20 00:36:46.468213 | orchestrator | Monday 20 April 2026 00:36:30 +0000 (0:00:01.101) 0:00:01.421 **********
2026-04-20 00:36:46.468233 | orchestrator | changed: [testbed-manager]
2026-04-20 00:36:46.468319 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:36:46.468340 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:36:46.468359 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:36:46.468380 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:36:46.468400 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:36:46.468420 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:36:46.468441 | orchestrator | 
2026-04-20 00:36:46.468461 | orchestrator | TASK [osism.services.fail2ban : Copy configuration files] **********************
2026-04-20 00:36:46.468479 | orchestrator | Monday 20 April 2026 00:36:41 +0000 (0:00:11.519) 0:00:12.941 **********
2026-04-20 00:36:46.468492 | orchestrator | changed: [testbed-manager]
2026-04-20 00:36:46.468505 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:36:46.468517 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:36:46.468529 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:36:46.468542 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:36:46.468554 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:36:46.468566 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:36:46.468578 | orchestrator | 
2026-04-20 00:36:46.468590 | orchestrator | TASK [osism.services.fail2ban : Manage fail2ban service] ***********************
2026-04-20 00:36:46.468603 | orchestrator | Monday 20 April 2026 00:36:43 +0000 (0:00:01.587) 0:00:14.529 **********
2026-04-20 00:36:46.468616 | orchestrator | ok: [testbed-manager]
2026-04-20 00:36:46.468630 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:36:46.468676 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:36:46.468689 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:36:46.468701 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:36:46.468713 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:36:46.468725 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:36:46.468737 | orchestrator | 
2026-04-20 00:36:46.468749 | orchestrator | TASK [osism.services.fail2ban : Reload fail2ban configuration] *****************
2026-04-20 00:36:46.468763 | orchestrator | Monday 20 April 2026 00:36:44 +0000 (0:00:01.194) 0:00:15.723 **********
2026-04-20 00:36:46.468775 | orchestrator | changed: [testbed-manager]
2026-04-20 00:36:46.468787 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:36:46.468799 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:36:46.468811 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:36:46.468823 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:36:46.468836 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:36:46.468848 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:36:46.468860 | orchestrator | 
2026-04-20 00:36:46.468871 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:36:46.468882 | orchestrator | testbed-manager : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:36:46.468894 | orchestrator | testbed-node-0 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:36:46.468905 | orchestrator | testbed-node-1 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:36:46.468916 | orchestrator | testbed-node-2 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:36:46.468927 | orchestrator | testbed-node-3 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:36:46.468952 | orchestrator | testbed-node-4 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:36:46.468964 | orchestrator | testbed-node-5 : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:36:46.468974 | orchestrator | 
2026-04-20 00:36:46.468986 | orchestrator | 
2026-04-20 00:36:46.468996 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:36:46.469007 | orchestrator | Monday 20 April 2026 00:36:46 +0000 (0:00:01.613) 0:00:17.337 **********
2026-04-20 00:36:46.469018 | orchestrator | ===============================================================================
2026-04-20 00:36:46.469029 | orchestrator | osism.services.fail2ban : Install fail2ban package --------------------- 11.52s
2026-04-20 00:36:46.469039 | orchestrator | osism.services.fail2ban : Reload fail2ban configuration ----------------- 1.61s
2026-04-20 00:36:46.469050 | orchestrator | osism.services.fail2ban : Copy configuration files ---------------------- 1.59s
2026-04-20 00:36:46.469061 | orchestrator | osism.services.fail2ban : Manage fail2ban service ----------------------- 1.19s
2026-04-20 00:36:46.469071 | orchestrator | osism.services.fail2ban : Include distribution specific install tasks --- 1.10s
2026-04-20 00:36:46.630584 | orchestrator | + osism apply network
2026-04-20 00:36:58.058978 | orchestrator | 2026-04-20 00:36:58 | INFO  | Prepare task for execution of network.
2026-04-20 00:36:58.122387 | orchestrator | 2026-04-20 00:36:58 | INFO  | Task 60195c22-6209-48da-a6e8-62b499b6f32c (network) was prepared for execution.
2026-04-20 00:36:58.122477 | orchestrator | 2026-04-20 00:36:58 | INFO  | It takes a moment until task 60195c22-6209-48da-a6e8-62b499b6f32c (network) has been started and output is visible here.
2026-04-20 00:37:24.268842 | orchestrator | 
2026-04-20 00:37:24.268938 | orchestrator | PLAY [Apply role network] ******************************************************
2026-04-20 00:37:24.268949 | orchestrator | 
2026-04-20 00:37:24.268977 | orchestrator | TASK [osism.commons.network : Gather variables for each operating system] ******
2026-04-20 00:37:24.268985 | orchestrator | Monday 20 April 2026 00:37:01 +0000 (0:00:00.303) 0:00:00.303 **********
2026-04-20 00:37:24.268992 | orchestrator | ok: [testbed-manager]
2026-04-20 00:37:24.269000 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:37:24.269008 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:37:24.269015 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:37:24.269022 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:37:24.269029 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:37:24.269036 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:37:24.269043 | orchestrator | 
2026-04-20 00:37:24.269050 | orchestrator | TASK [osism.commons.network : Include type specific tasks] *********************
2026-04-20 00:37:24.269057 | orchestrator | Monday 20 April 2026 00:37:01 +0000 (0:00:00.477) 0:00:00.780 **********
2026-04-20 00:37:24.269065 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/netplan-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:37:24.269086 | orchestrator | 
2026-04-20 00:37:24.269093 | orchestrator | TASK [osism.commons.network : Install required packages] ***********************
2026-04-20 00:37:24.269101 | orchestrator | Monday 20 April 2026 00:37:02 +0000 (0:00:02.725)? 0:00:01.838 **********
2026-04-20 00:37:24.269108 | orchestrator | ok: [testbed-manager]
2026-04-20 00:37:24.269115 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:37:24.269122 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:37:24.269129 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:37:24.269136 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:37:24.269144 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:37:24.269151 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:37:24.269158 | orchestrator | 
2026-04-20 00:37:24.269165 | orchestrator | TASK [osism.commons.network : Remove ifupdown package] *************************
2026-04-20 00:37:24.269172 | orchestrator | Monday 20 April 2026 00:37:05 +0000 (0:00:02.725) 0:00:04.564 **********
2026-04-20 00:37:24.269179 | orchestrator | ok: [testbed-manager]
2026-04-20 00:37:24.269205 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:37:24.269213 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:37:24.269220 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:37:24.269227 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:37:24.269234 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:37:24.269241 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:37:24.269248 | orchestrator | 
2026-04-20 00:37:24.269256 | orchestrator | TASK [osism.commons.network : Create required directories] *********************
2026-04-20 00:37:24.269263 | orchestrator | Monday 20 April 2026 00:37:06 +0000 (0:00:01.634) 0:00:06.198 **********
2026-04-20 00:37:24.269270 | orchestrator | ok: [testbed-manager] => (item=/etc/netplan)
2026-04-20 00:37:24.269278 | orchestrator | ok: [testbed-node-0] => (item=/etc/netplan)
2026-04-20 00:37:24.269285 | orchestrator | ok: [testbed-node-1] => (item=/etc/netplan)
2026-04-20 00:37:24.269292 | orchestrator | ok: [testbed-node-2] => (item=/etc/netplan)
2026-04-20 00:37:24.269299 | orchestrator | ok: [testbed-node-3] => (item=/etc/netplan)
2026-04-20 00:37:24.269306 | orchestrator | ok: [testbed-node-4] => (item=/etc/netplan)
2026-04-20 00:37:24.269314 | orchestrator | ok: [testbed-node-5] => (item=/etc/netplan)
2026-04-20 00:37:24.269321 | orchestrator | 
2026-04-20 00:37:24.269328 | orchestrator | TASK [osism.commons.network : Prepare netplan configuration template] **********
2026-04-20 00:37:24.269335 | orchestrator | Monday 20 April 2026 00:37:08 +0000 (0:00:01.167) 0:00:07.365 **********
2026-04-20 00:37:24.269342 | orchestrator | ok: [testbed-node-0 -> localhost]
2026-04-20 00:37:24.269351 | orchestrator | ok: [testbed-node-2 -> localhost]
2026-04-20 00:37:24.269358 | orchestrator | ok: [testbed-manager -> localhost]
2026-04-20 00:37:24.269365 | orchestrator | ok: [testbed-node-1 -> localhost]
2026-04-20 00:37:24.269372 | orchestrator | ok: [testbed-node-4 -> localhost]
2026-04-20 00:37:24.269380 | orchestrator | ok: [testbed-node-3 -> localhost]
2026-04-20 00:37:24.269387 | orchestrator | ok: [testbed-node-5 -> localhost]
2026-04-20 00:37:24.269401 | orchestrator | 
2026-04-20 00:37:24.269421 | orchestrator | TASK [osism.commons.network : Copy netplan configuration] **********************
2026-04-20 00:37:24.269430 | orchestrator | Monday 20 April 2026 00:37:11 +0000 (0:00:03.322) 0:00:10.688 **********
2026-04-20 00:37:24.269439 | orchestrator | changed: [testbed-manager]
2026-04-20 00:37:24.269447 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:37:24.269455 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:37:24.269463 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:37:24.269471 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:37:24.269479 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:37:24.269488 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:37:24.269496 | orchestrator | 
2026-04-20 00:37:24.269504 | orchestrator | TASK [osism.commons.network : Remove netplan configuration template] ***********
2026-04-20 00:37:24.269513 | orchestrator | Monday 20 April 2026 00:37:12 +0000 (0:00:01.427) 0:00:12.116 **********
2026-04-20 00:37:24.269521 | orchestrator | ok: [testbed-manager -> localhost]
2026-04-20 00:37:24.269529 | orchestrator | ok: [testbed-node-0 -> localhost]
2026-04-20 00:37:24.269537 | orchestrator | ok: [testbed-node-2 -> localhost]
2026-04-20 00:37:24.269545 | orchestrator | ok: [testbed-node-5 -> localhost]
2026-04-20 00:37:24.269553 | orchestrator | ok: [testbed-node-3 -> localhost]
2026-04-20 00:37:24.269561 | orchestrator | ok: [testbed-node-1 -> localhost]
2026-04-20 00:37:24.269570 | orchestrator | ok: [testbed-node-4 -> localhost]
2026-04-20 00:37:24.269578 | orchestrator | 
2026-04-20 00:37:24.269587 | orchestrator | TASK [osism.commons.network : Check if path for interface file exists] *********
2026-04-20 00:37:24.269595 | orchestrator | Monday 20 April 2026 00:37:14 +0000 (0:00:01.539) 0:00:13.655 **********
2026-04-20 00:37:24.269604 | orchestrator | ok: [testbed-manager]
2026-04-20 00:37:24.269612 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:37:24.269620 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:37:24.269628 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:37:24.269636 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:37:24.269644 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:37:24.269652 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:37:24.269660 | orchestrator | 
2026-04-20 00:37:24.269669 | orchestrator | TASK [osism.commons.network : Copy interfaces file] ****************************
2026-04-20 00:37:24.269690 | orchestrator | Monday 20 April 2026 00:37:15 +0000 (0:00:00.880) 0:00:14.536 **********
2026-04-20 00:37:24.269699 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:37:24.269708 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:37:24.269716 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:37:24.269723 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:37:24.269731 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:37:24.269740 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:37:24.269748 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:37:24.269756 | orchestrator | 
2026-04-20 00:37:24.269763 | orchestrator | TASK [osism.commons.network : Install package networkd-dispatcher] *************
2026-04-20 00:37:24.269770 | orchestrator | Monday 20 April 2026 00:37:15 +0000 (0:00:00.667) 0:00:15.203 **********
2026-04-20 00:37:24.269777 | orchestrator | ok: [testbed-manager]
2026-04-20 00:37:24.269784 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:37:24.269791 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:37:24.269798 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:37:24.269805 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:37:24.269812 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:37:24.269819 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:37:24.269826 | orchestrator | 
2026-04-20 00:37:24.269834 | orchestrator | TASK [osism.commons.network : Copy dispatcher scripts] *************************
2026-04-20 00:37:24.269841 | orchestrator | Monday 20 April 2026 00:37:18 +0000 (0:00:02.026) 0:00:17.230 **********
2026-04-20 00:37:24.269848 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:37:24.269855 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:37:24.269862 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:37:24.269869 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:37:24.269882 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:37:24.269889 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:37:24.269897 | orchestrator | changed: [testbed-manager] => (item={'src': '/opt/configuration/network/iptables.sh', 'dest': 'routable.d/iptables.sh'})
2026-04-20 00:37:24.269904 | orchestrator | 
2026-04-20 00:37:24.269912 | orchestrator | TASK [osism.commons.network : Manage service networkd-dispatcher] **************
2026-04-20 00:37:24.269919 | orchestrator | Monday 20 April 2026 00:37:18 +0000 (0:00:00.731) 0:00:17.961 **********
2026-04-20 00:37:24.269926 | orchestrator | ok: [testbed-manager]
2026-04-20 00:37:24.269933 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:37:24.269940 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:37:24.269947 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:37:24.269954 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:37:24.269961 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:37:24.269968 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:37:24.269975 | orchestrator | 
2026-04-20 00:37:24.269982 | orchestrator | TASK [osism.commons.network : Include cleanup tasks] ***************************
2026-04-20 00:37:24.269989 | orchestrator | Monday 20 April 2026 00:37:20 +0000 (0:00:01.466) 0:00:19.427 **********
2026-04-20 00:37:24.269997 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/cleanup-netplan.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:37:24.270006 | orchestrator | 
2026-04-20 00:37:24.270013 | orchestrator | TASK [osism.commons.network : List existing configuration files] ***************
2026-04-20 00:37:24.270066 | orchestrator | Monday 20 April 2026 00:37:21 +0000 (0:00:01.130) 0:00:20.650 **********
2026-04-20 00:37:24.270074 | orchestrator | ok: [testbed-manager]
2026-04-20 00:37:24.270081 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:37:24.270088 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:37:24.270095 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:37:24.270103 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:37:24.270110 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:37:24.270117 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:37:24.270124 | orchestrator | 
2026-04-20 00:37:24.270131 | orchestrator | TASK [osism.commons.network : Set network_configured_files fact] ***************
2026-04-20 00:37:24.270138 | orchestrator | Monday 20 April 2026 00:37:22 +0000 (0:00:00.716) 0:00:21.781 **********
2026-04-20 00:37:24.270146 | orchestrator | ok: [testbed-manager]
2026-04-20 00:37:24.270153 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:37:24.270160 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:37:24.270167 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:37:24.270178 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:37:24.270199 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:37:24.270207 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:37:24.270214 | orchestrator | 
2026-04-20 00:37:24.270221 | orchestrator | TASK [osism.commons.network : Remove unused configuration files] ***************
2026-04-20 00:37:24.270228 | orchestrator | Monday 20 April 2026 00:37:23 +0000 (0:00:00.716) 0:00:22.497 **********
2026-04-20 00:37:24.270236 | orchestrator | skipping: [testbed-manager] => (item=/etc/netplan/01-osism.yaml) 
2026-04-20 00:37:24.270243 | orchestrator | skipping: [testbed-node-0] => (item=/etc/netplan/01-osism.yaml) 
2026-04-20 00:37:24.270250 | orchestrator | skipping: [testbed-node-1] => (item=/etc/netplan/01-osism.yaml) 
2026-04-20 00:37:24.270258 | orchestrator | skipping: [testbed-node-2] => (item=/etc/netplan/01-osism.yaml) 
2026-04-20 00:37:24.270265 | orchestrator | changed: [testbed-manager] => (item=/etc/netplan/50-cloud-init.yaml)
2026-04-20 00:37:24.270272 | orchestrator | skipping: [testbed-node-3] => (item=/etc/netplan/01-osism.yaml) 
2026-04-20 00:37:24.270279 | orchestrator | skipping: [testbed-node-4] => (item=/etc/netplan/01-osism.yaml) 
2026-04-20 00:37:24.270286 | orchestrator | changed: [testbed-node-0] => (item=/etc/netplan/50-cloud-init.yaml)
2026-04-20 00:37:24.270294 | orchestrator | changed: [testbed-node-1] => (item=/etc/netplan/50-cloud-init.yaml)
2026-04-20 00:37:24.270306 | orchestrator | skipping: [testbed-node-5] => (item=/etc/netplan/01-osism.yaml) 
2026-04-20 00:37:24.270314 | orchestrator | changed: [testbed-node-2] => (item=/etc/netplan/50-cloud-init.yaml)
2026-04-20 00:37:24.270321 | orchestrator | changed: [testbed-node-3] => (item=/etc/netplan/50-cloud-init.yaml)
2026-04-20 00:37:24.270328 | orchestrator | changed: [testbed-node-4] => (item=/etc/netplan/50-cloud-init.yaml)
2026-04-20 00:37:24.270335 | orchestrator | changed: [testbed-node-5] => (item=/etc/netplan/50-cloud-init.yaml)
2026-04-20 00:37:24.270342 | orchestrator | 
2026-04-20 00:37:24.270355 | orchestrator | TASK [osism.commons.network : Include dummy interfaces] ************************
2026-04-20 00:37:38.741232 | orchestrator | Monday 20 April 2026 00:37:24 +0000 (0:00:00.986) 0:00:23.483 **********
2026-04-20 00:37:38.741347 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:37:38.741364 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:37:38.741375 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:37:38.741390 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:37:38.741409 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:37:38.741440 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:37:38.741461 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:37:38.741480 | orchestrator | 
2026-04-20 00:37:38.741500 | orchestrator | TASK [osism.commons.network : Include vxlan interfaces] ************************
2026-04-20 00:37:38.741519 | orchestrator | Monday 20 April 2026 00:37:24 +0000 (0:00:00.715) 0:00:24.199 **********
2026-04-20 00:37:38.741541 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/vxlan-interfaces.yml for testbed-manager, testbed-node-0, testbed-node-5, testbed-node-1, testbed-node-4, testbed-node-2, testbed-node-3
2026-04-20 00:37:38.741564 | orchestrator | 
2026-04-20 00:37:38.741584 | orchestrator | TASK [osism.commons.network : Create systemd networkd netdev files] ************
2026-04-20 00:37:38.741604 | orchestrator | Monday 20 April 2026 00:37:29 +0000 (0:00:04.131) 0:00:28.331 **********
2026-04-20 00:37:38.741627 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.10', 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-20 00:37:38.741650 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.5', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'addresses': ['192.168.112.5/20']}})
2026-04-20 00:37:38.741673 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.11', 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-20 00:37:38.741694 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.10', 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.10/20']}})
2026-04-20 00:37:38.741732 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.5', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'addresses': ['192.168.128.5/20']}})
2026-04-20 00:37:38.741753 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.13', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-20 00:37:38.741788 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.12', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-20 00:37:38.741845 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.15', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'addresses': []}})
2026-04-20 00:37:38.741861 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.14', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-20 00:37:38.741872 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.11', 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.11/20']}})
2026-04-20 00:37:38.741884 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.13', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.13/20']}})
2026-04-20 00:37:38.741914 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.12', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.12/20']}})
2026-04-20 00:37:38.741927 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.15', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'addresses': ['192.168.128.15/20']}})
2026-04-20 00:37:38.741937 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.14', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.14/20']}})
2026-04-20 00:37:38.741948 | orchestrator | 
2026-04-20 00:37:38.741960 | orchestrator | TASK [osism.commons.network : Create systemd networkd network files] ***********
2026-04-20 00:37:38.741971 | orchestrator | Monday 20 April 2026 00:37:33 +0000 (0:00:04.805) 0:00:33.136 **********
2026-04-20 00:37:38.741982 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.5', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'addresses': ['192.168.112.5/20']}})
2026-04-20 00:37:38.741993 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.10', 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-20 00:37:38.742004 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.5', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'addresses': ['192.168.128.5/20']}})
2026-04-20 00:37:38.742071 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.11', 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-20 00:37:38.742086 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.14', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-20 00:37:38.742097 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.13', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-20 00:37:38.742117 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.12', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': []}})
2026-04-20 00:37:38.742134 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan0', 'value': {'vni': 42, 'mtu': 1350, 'local_ip': '192.168.16.15', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'addresses': []}})
2026-04-20 00:37:38.742145 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.10', 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.10/20']}})
2026-04-20 00:37:38.742181 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.11', 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.11/20']}})
2026-04-20 00:37:38.742194 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.14', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.14/20']}})
2026-04-20 00:37:38.742204 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.13', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.13/20']}})
2026-04-20 00:37:38.742230 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.12', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'addresses': ['192.168.128.12/20']}})
2026-04-20 00:37:51.445498 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan1', 'value': {'vni': 23, 'mtu': 1350, 'local_ip': '192.168.16.15', 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'addresses': ['192.168.128.15/20']}})
2026-04-20 00:37:51.445612 | orchestrator | 
2026-04-20 00:37:51.445628 | orchestrator | TASK [osism.commons.network : Include networkd cleanup tasks] ******************
2026-04-20 00:37:51.445641 | orchestrator | Monday 20 April 2026 00:37:39 +0000 (0:00:05.121) 0:00:38.258 **********
2026-04-20 00:37:51.445653 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/cleanup-networkd.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:37:51.445665 | orchestrator | 
2026-04-20 00:37:51.445695 | orchestrator | TASK [osism.commons.network : List existing configuration files] ***************
2026-04-20 00:37:51.445706 | orchestrator | Monday 20 April 2026 00:37:40 +0000 (0:00:01.189) 0:00:39.448 **********
2026-04-20 00:37:51.445717 | orchestrator | ok: [testbed-manager]
2026-04-20 00:37:51.445730 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:37:51.445740 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:37:51.445751 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:37:51.445762 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:37:51.445773 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:37:51.445783 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:37:51.445794 | orchestrator | 
2026-04-20 00:37:51.445805 | orchestrator | TASK [osism.commons.network : Remove unused configuration files] ***************
2026-04-20 00:37:51.445816 | orchestrator | Monday 20 April 2026 00:37:41 +0000 (0:00:01.598) 0:00:41.047 **********
2026-04-20 00:37:51.445827 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan1.network) 
2026-04-20 00:37:51.445840 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan0.network) 
2026-04-20 00:37:51.445878 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan1.netdev) 
2026-04-20 00:37:51.445895 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan0.netdev) 
2026-04-20 00:37:51.445915 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:37:51.445936 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan1.network) 
2026-04-20 00:37:51.445955 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan0.network) 
2026-04-20 00:37:51.445974 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan1.netdev) 
2026-04-20 00:37:51.445992 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan0.netdev) 
2026-04-20 00:37:51.446012 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:37:51.446110 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan1.network) 
2026-04-20 00:37:51.446130 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan0.network) 
2026-04-20 00:37:51.446229 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan1.netdev) 
2026-04-20 00:37:51.446242 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan0.netdev) 
2026-04-20 00:37:51.446255 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:37:51.446268 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan1.network) 
2026-04-20 00:37:51.446281 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan0.network) 
2026-04-20 00:37:51.446293 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan1.netdev) 
2026-04-20 00:37:51.446306 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan0.netdev) 
2026-04-20 00:37:51.446318 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:37:51.446330 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan1.network) 
2026-04-20 00:37:51.446342 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan0.network) 
2026-04-20 00:37:51.446354 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan1.netdev) 
2026-04-20 00:37:51.446367 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan0.netdev) 
2026-04-20 00:37:51.446380 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:37:51.446390 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan1.network) 
2026-04-20 00:37:51.446401 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan0.network) 
2026-04-20 00:37:51.446412 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan1.netdev) 
2026-04-20 00:37:51.446423 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan0.netdev) 
2026-04-20 00:37:51.446433 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:37:51.446444 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan1.network) 
2026-04-20 00:37:51.446454 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan0.network) 
2026-04-20 00:37:51.446465 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan1.netdev) 
2026-04-20 00:37:51.446475 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan0.netdev) 
2026-04-20 00:37:51.446486 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:37:51.446497 | orchestrator | 
2026-04-20 00:37:51.446507 | orchestrator | TASK [osism.commons.network : Include network extra init] **********************
2026-04-20 00:37:51.446537 | orchestrator | Monday 20 April 2026 00:37:42 +0000 (0:00:00.685) 0:00:41.732 **********
2026-04-20 00:37:51.446550 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/network-extra-init.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:37:51.446561 | orchestrator | 
2026-04-20 00:37:51.446584 | orchestrator | TASK [osism.commons.network : Deploy network-extra-init script] ****************
2026-04-20 00:37:51.446595 | orchestrator | Monday 20 April 2026 00:37:43 +0000 (0:00:01.204) 0:00:42.936 **********
2026-04-20 00:37:51.446605 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:37:51.446616 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:37:51.446645 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:37:51.446656 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:37:51.446667 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:37:51.446678 | orchestrator | 
skipping: [testbed-node-4] 2026-04-20 00:37:51.446689 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:37:51.446699 | orchestrator | 2026-04-20 00:37:51.446710 | orchestrator | TASK [osism.commons.network : Deploy network-extra-init systemd service] ******* 2026-04-20 00:37:51.446721 | orchestrator | Monday 20 April 2026 00:37:44 +0000 (0:00:00.755) 0:00:43.692 ********** 2026-04-20 00:37:51.446732 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:37:51.446743 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:37:51.446753 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:37:51.446764 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:37:51.446778 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:37:51.446797 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:37:51.446815 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:37:51.446833 | orchestrator | 2026-04-20 00:37:51.446921 | orchestrator | TASK [osism.commons.network : Enable and start network-extra-init service] ***** 2026-04-20 00:37:51.446945 | orchestrator | Monday 20 April 2026 00:37:45 +0000 (0:00:00.596) 0:00:44.289 ********** 2026-04-20 00:37:51.446965 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:37:51.446984 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:37:51.447003 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:37:51.447022 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:37:51.447041 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:37:51.447061 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:37:51.447079 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:37:51.447099 | orchestrator | 2026-04-20 00:37:51.447119 | orchestrator | TASK [osism.commons.network : Disable and stop network-extra-init service] ***** 2026-04-20 00:37:51.447183 | orchestrator | Monday 20 April 2026 00:37:45 +0000 (0:00:00.758) 0:00:45.047 ********** 2026-04-20 00:37:51.447204 | orchestrator | ok: 
[testbed-manager] 2026-04-20 00:37:51.447223 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:37:51.447243 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:37:51.447263 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:37:51.447284 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:37:51.447302 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:37:51.447321 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:37:51.447340 | orchestrator | 2026-04-20 00:37:51.447358 | orchestrator | TASK [osism.commons.network : Remove network-extra-init systemd service] ******* 2026-04-20 00:37:51.447377 | orchestrator | Monday 20 April 2026 00:37:47 +0000 (0:00:01.477) 0:00:46.525 ********** 2026-04-20 00:37:51.447396 | orchestrator | ok: [testbed-manager] 2026-04-20 00:37:51.447414 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:37:51.447429 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:37:51.447439 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:37:51.447450 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:37:51.447461 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:37:51.447471 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:37:51.447482 | orchestrator | 2026-04-20 00:37:51.447493 | orchestrator | TASK [osism.commons.network : Remove network-extra-init script] **************** 2026-04-20 00:37:51.447503 | orchestrator | Monday 20 April 2026 00:37:48 +0000 (0:00:01.026) 0:00:47.551 ********** 2026-04-20 00:37:51.447514 | orchestrator | ok: [testbed-manager] 2026-04-20 00:37:51.447533 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:37:51.447549 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:37:51.447560 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:37:51.447570 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:37:51.447581 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:37:51.447602 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:37:51.447613 | orchestrator | 2026-04-20 00:37:51.447624 | orchestrator | RUNNING HANDLER 
[osism.commons.network : Reload systemd-networkd] ************** 2026-04-20 00:37:51.447635 | orchestrator | Monday 20 April 2026 00:37:50 +0000 (0:00:01.948) 0:00:49.500 ********** 2026-04-20 00:37:51.447645 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:37:51.447656 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:37:51.447667 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:37:51.447677 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:37:51.447688 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:37:51.447699 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:37:51.447710 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:37:51.447720 | orchestrator | 2026-04-20 00:37:51.447736 | orchestrator | RUNNING HANDLER [osism.commons.network : Netplan configuration changed] ******** 2026-04-20 00:37:51.447753 | orchestrator | Monday 20 April 2026 00:37:50 +0000 (0:00:00.566) 0:00:50.066 ********** 2026-04-20 00:37:51.447772 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:37:51.447791 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:37:51.447809 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:37:51.447828 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:37:51.447847 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:37:51.447858 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:37:51.447868 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:37:51.447879 | orchestrator | 2026-04-20 00:37:51.447890 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:37:51.447901 | orchestrator | testbed-manager : ok=25  changed=5  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 2026-04-20 00:37:51.447914 | orchestrator | testbed-node-0 : ok=24  changed=5  unreachable=0 failed=0 skipped=9  rescued=0 ignored=0 2026-04-20 00:37:51.447936 | orchestrator | testbed-node-1 : ok=24  changed=5  unreachable=0 
failed=0 skipped=9  rescued=0 ignored=0 2026-04-20 00:37:51.612187 | orchestrator | testbed-node-2 : ok=24  changed=5  unreachable=0 failed=0 skipped=9  rescued=0 ignored=0 2026-04-20 00:37:51.612269 | orchestrator | testbed-node-3 : ok=24  changed=5  unreachable=0 failed=0 skipped=9  rescued=0 ignored=0 2026-04-20 00:37:51.612276 | orchestrator | testbed-node-4 : ok=24  changed=5  unreachable=0 failed=0 skipped=9  rescued=0 ignored=0 2026-04-20 00:37:51.612283 | orchestrator | testbed-node-5 : ok=24  changed=5  unreachable=0 failed=0 skipped=9  rescued=0 ignored=0 2026-04-20 00:37:51.612289 | orchestrator | 2026-04-20 00:37:51.612295 | orchestrator | 2026-04-20 00:37:51.612300 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 00:37:51.612308 | orchestrator | Monday 20 April 2026 00:37:51 +0000 (0:00:00.594) 0:00:50.661 ********** 2026-04-20 00:37:51.612313 | orchestrator | =============================================================================== 2026-04-20 00:37:51.612318 | orchestrator | osism.commons.network : Create systemd networkd network files ----------- 5.12s 2026-04-20 00:37:51.612324 | orchestrator | osism.commons.network : Create systemd networkd netdev files ------------ 4.81s 2026-04-20 00:37:51.612329 | orchestrator | osism.commons.network : Include vxlan interfaces ------------------------ 4.13s 2026-04-20 00:37:51.612335 | orchestrator | osism.commons.network : Prepare netplan configuration template ---------- 3.32s 2026-04-20 00:37:51.612341 | orchestrator | osism.commons.network : Install required packages ----------------------- 2.73s 2026-04-20 00:37:51.612346 | orchestrator | osism.commons.network : Install package networkd-dispatcher ------------- 2.03s 2026-04-20 00:37:51.612352 | orchestrator | osism.commons.network : Remove network-extra-init script ---------------- 1.95s 2026-04-20 00:37:51.612379 | orchestrator | osism.commons.network : Remove ifupdown package 
------------------------- 1.63s 2026-04-20 00:37:51.612385 | orchestrator | osism.commons.network : List existing configuration files --------------- 1.60s 2026-04-20 00:37:51.612391 | orchestrator | osism.commons.network : Remove netplan configuration template ----------- 1.54s 2026-04-20 00:37:51.612396 | orchestrator | osism.commons.network : Disable and stop network-extra-init service ----- 1.48s 2026-04-20 00:37:51.612402 | orchestrator | osism.commons.network : Manage service networkd-dispatcher -------------- 1.47s 2026-04-20 00:37:51.612408 | orchestrator | osism.commons.network : Copy netplan configuration ---------------------- 1.43s 2026-04-20 00:37:51.612413 | orchestrator | osism.commons.network : Include cleanup tasks --------------------------- 1.22s 2026-04-20 00:37:51.612419 | orchestrator | osism.commons.network : Include network extra init ---------------------- 1.20s 2026-04-20 00:37:51.612425 | orchestrator | osism.commons.network : Include networkd cleanup tasks ------------------ 1.19s 2026-04-20 00:37:51.612431 | orchestrator | osism.commons.network : Create required directories --------------------- 1.17s 2026-04-20 00:37:51.612436 | orchestrator | osism.commons.network : List existing configuration files --------------- 1.13s 2026-04-20 00:37:51.612442 | orchestrator | osism.commons.network : Include type specific tasks --------------------- 1.06s 2026-04-20 00:37:51.612448 | orchestrator | osism.commons.network : Remove network-extra-init systemd service ------- 1.03s 2026-04-20 00:37:51.735243 | orchestrator | + osism apply wireguard 2026-04-20 00:38:02.864838 | orchestrator | 2026-04-20 00:38:02 | INFO  | Prepare task for execution of wireguard. 2026-04-20 00:38:02.938486 | orchestrator | 2026-04-20 00:38:02 | INFO  | Task 1f26c89f-dcc5-4831-9f57-202c5b242f86 (wireguard) was prepared for execution. 
2026-04-20 00:38:02.938589 | orchestrator | 2026-04-20 00:38:02 | INFO  | It takes a moment until task 1f26c89f-dcc5-4831-9f57-202c5b242f86 (wireguard) has been started and output is visible here. 2026-04-20 00:38:19.803660 | orchestrator | 2026-04-20 00:38:19.803766 | orchestrator | PLAY [Apply role wireguard] **************************************************** 2026-04-20 00:38:19.803781 | orchestrator | 2026-04-20 00:38:19.803792 | orchestrator | TASK [osism.services.wireguard : Install iptables package] ********************* 2026-04-20 00:38:19.803802 | orchestrator | Monday 20 April 2026 00:38:06 +0000 (0:00:00.256) 0:00:00.256 ********** 2026-04-20 00:38:19.803812 | orchestrator | ok: [testbed-manager] 2026-04-20 00:38:19.803823 | orchestrator | 2026-04-20 00:38:19.803833 | orchestrator | TASK [osism.services.wireguard : Install wireguard package] ******************** 2026-04-20 00:38:19.803842 | orchestrator | Monday 20 April 2026 00:38:07 +0000 (0:00:01.571) 0:00:01.827 ********** 2026-04-20 00:38:19.803852 | orchestrator | changed: [testbed-manager] 2026-04-20 00:38:19.803862 | orchestrator | 2026-04-20 00:38:19.803872 | orchestrator | TASK [osism.services.wireguard : Create public and private key - server] ******* 2026-04-20 00:38:19.803881 | orchestrator | Monday 20 April 2026 00:38:12 +0000 (0:00:05.289) 0:00:07.117 ********** 2026-04-20 00:38:19.803891 | orchestrator | changed: [testbed-manager] 2026-04-20 00:38:19.803900 | orchestrator | 2026-04-20 00:38:19.803910 | orchestrator | TASK [osism.services.wireguard : Create preshared key] ************************* 2026-04-20 00:38:19.803919 | orchestrator | Monday 20 April 2026 00:38:13 +0000 (0:00:00.506) 0:00:07.623 ********** 2026-04-20 00:38:19.803929 | orchestrator | changed: [testbed-manager] 2026-04-20 00:38:19.803938 | orchestrator | 2026-04-20 00:38:19.803948 | orchestrator | TASK [osism.services.wireguard : Get preshared key] **************************** 2026-04-20 00:38:19.803957 | orchestrator 
| Monday 20 April 2026 00:38:13 +0000 (0:00:00.442) 0:00:08.066 ********** 2026-04-20 00:38:19.803967 | orchestrator | ok: [testbed-manager] 2026-04-20 00:38:19.803977 | orchestrator | 2026-04-20 00:38:19.803986 | orchestrator | TASK [osism.services.wireguard : Get public key - server] ********************** 2026-04-20 00:38:19.803996 | orchestrator | Monday 20 April 2026 00:38:14 +0000 (0:00:00.508) 0:00:08.575 ********** 2026-04-20 00:38:19.804006 | orchestrator | ok: [testbed-manager] 2026-04-20 00:38:19.804015 | orchestrator | 2026-04-20 00:38:19.804025 | orchestrator | TASK [osism.services.wireguard : Get private key - server] ********************* 2026-04-20 00:38:19.804053 | orchestrator | Monday 20 April 2026 00:38:14 +0000 (0:00:00.407) 0:00:08.982 ********** 2026-04-20 00:38:19.804063 | orchestrator | ok: [testbed-manager] 2026-04-20 00:38:19.804073 | orchestrator | 2026-04-20 00:38:19.804082 | orchestrator | TASK [osism.services.wireguard : Copy wg0.conf configuration file] ************* 2026-04-20 00:38:19.804127 | orchestrator | Monday 20 April 2026 00:38:15 +0000 (0:00:00.402) 0:00:09.385 ********** 2026-04-20 00:38:19.804137 | orchestrator | changed: [testbed-manager] 2026-04-20 00:38:19.804147 | orchestrator | 2026-04-20 00:38:19.804156 | orchestrator | TASK [osism.services.wireguard : Copy client configuration files] ************** 2026-04-20 00:38:19.804166 | orchestrator | Monday 20 April 2026 00:38:16 +0000 (0:00:01.013) 0:00:10.399 ********** 2026-04-20 00:38:19.804175 | orchestrator | changed: [testbed-manager] => (item=None) 2026-04-20 00:38:19.804185 | orchestrator | changed: [testbed-manager] 2026-04-20 00:38:19.804194 | orchestrator | 2026-04-20 00:38:19.804205 | orchestrator | TASK [osism.services.wireguard : Manage wg-quick@wg0.service service] ********** 2026-04-20 00:38:19.804216 | orchestrator | Monday 20 April 2026 00:38:17 +0000 (0:00:00.826) 0:00:11.225 ********** 2026-04-20 00:38:19.804227 | orchestrator | changed: 
[testbed-manager] 2026-04-20 00:38:19.804238 | orchestrator | 2026-04-20 00:38:19.804249 | orchestrator | RUNNING HANDLER [osism.services.wireguard : Restart wg0 service] *************** 2026-04-20 00:38:19.804259 | orchestrator | Monday 20 April 2026 00:38:18 +0000 (0:00:01.717) 0:00:12.942 ********** 2026-04-20 00:38:19.804270 | orchestrator | changed: [testbed-manager] 2026-04-20 00:38:19.804281 | orchestrator | 2026-04-20 00:38:19.804291 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:38:19.804303 | orchestrator | testbed-manager : ok=11  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-20 00:38:19.804314 | orchestrator | 2026-04-20 00:38:19.804325 | orchestrator | 2026-04-20 00:38:19.804336 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 00:38:19.804347 | orchestrator | Monday 20 April 2026 00:38:19 +0000 (0:00:00.830) 0:00:13.773 ********** 2026-04-20 00:38:19.804358 | orchestrator | =============================================================================== 2026-04-20 00:38:19.804369 | orchestrator | osism.services.wireguard : Install wireguard package -------------------- 5.29s 2026-04-20 00:38:19.804380 | orchestrator | osism.services.wireguard : Manage wg-quick@wg0.service service ---------- 1.72s 2026-04-20 00:38:19.804391 | orchestrator | osism.services.wireguard : Install iptables package --------------------- 1.57s 2026-04-20 00:38:19.804401 | orchestrator | osism.services.wireguard : Copy wg0.conf configuration file ------------- 1.01s 2026-04-20 00:38:19.804412 | orchestrator | osism.services.wireguard : Restart wg0 service -------------------------- 0.83s 2026-04-20 00:38:19.804423 | orchestrator | osism.services.wireguard : Copy client configuration files -------------- 0.83s 2026-04-20 00:38:19.804434 | orchestrator | osism.services.wireguard : Get preshared key ---------------------------- 0.51s 
2026-04-20 00:38:19.804445 | orchestrator | osism.services.wireguard : Create public and private key - server ------- 0.51s 2026-04-20 00:38:19.804456 | orchestrator | osism.services.wireguard : Create preshared key ------------------------- 0.44s 2026-04-20 00:38:19.804466 | orchestrator | osism.services.wireguard : Get public key - server ---------------------- 0.41s 2026-04-20 00:38:19.804485 | orchestrator | osism.services.wireguard : Get private key - server --------------------- 0.40s 2026-04-20 00:38:19.920702 | orchestrator | + sh -c /opt/configuration/scripts/prepare-wireguard-configuration.sh 2026-04-20 00:38:19.951601 | orchestrator | % Total % Received % Xferd Average Speed Time Time Time Current 2026-04-20 00:38:19.951710 | orchestrator | Dload Upload Total Spent Left Speed 2026-04-20 00:38:20.022699 | orchestrator | 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 15 100 15 0 0 210 0 --:--:-- --:--:-- --:--:-- 211 2026-04-20 00:38:20.034557 | orchestrator | + osism apply --environment custom workarounds 2026-04-20 00:38:21.184323 | orchestrator | 2026-04-20 00:38:21 | INFO  | Trying to run play workarounds in environment custom 2026-04-20 00:38:31.323645 | orchestrator | 2026-04-20 00:38:31 | INFO  | Prepare task for execution of workarounds. 2026-04-20 00:38:31.383845 | orchestrator | 2026-04-20 00:38:31 | INFO  | Task 3639a737-adca-421f-953e-2a67bb121774 (workarounds) was prepared for execution. 2026-04-20 00:38:31.383938 | orchestrator | 2026-04-20 00:38:31 | INFO  | It takes a moment until task 3639a737-adca-421f-953e-2a67bb121774 (workarounds) has been started and output is visible here. 
2026-04-20 00:38:54.469689 | orchestrator | 2026-04-20 00:38:54.469870 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-04-20 00:38:54.469902 | orchestrator | 2026-04-20 00:38:54.469920 | orchestrator | TASK [Group hosts based on virtualization_role] ******************************** 2026-04-20 00:38:54.469937 | orchestrator | Monday 20 April 2026 00:38:34 +0000 (0:00:00.175) 0:00:00.175 ********** 2026-04-20 00:38:54.469956 | orchestrator | changed: [testbed-manager] => (item=virtualization_role_guest) 2026-04-20 00:38:54.469997 | orchestrator | changed: [testbed-node-0] => (item=virtualization_role_guest) 2026-04-20 00:38:54.470223 | orchestrator | changed: [testbed-node-1] => (item=virtualization_role_guest) 2026-04-20 00:38:54.470252 | orchestrator | changed: [testbed-node-2] => (item=virtualization_role_guest) 2026-04-20 00:38:54.470272 | orchestrator | changed: [testbed-node-3] => (item=virtualization_role_guest) 2026-04-20 00:38:54.470292 | orchestrator | changed: [testbed-node-4] => (item=virtualization_role_guest) 2026-04-20 00:38:54.470312 | orchestrator | changed: [testbed-node-5] => (item=virtualization_role_guest) 2026-04-20 00:38:54.470332 | orchestrator | 2026-04-20 00:38:54.470351 | orchestrator | PLAY [Apply netplan configuration on the manager node] ************************* 2026-04-20 00:38:54.470371 | orchestrator | 2026-04-20 00:38:54.470391 | orchestrator | TASK [Apply netplan configuration] ********************************************* 2026-04-20 00:38:54.470412 | orchestrator | Monday 20 April 2026 00:38:35 +0000 (0:00:00.643) 0:00:00.819 ********** 2026-04-20 00:38:54.470432 | orchestrator | ok: [testbed-manager] 2026-04-20 00:38:54.470456 | orchestrator | 2026-04-20 00:38:54.470478 | orchestrator | PLAY [Apply netplan configuration on all other nodes] ************************** 2026-04-20 00:38:54.470498 | orchestrator | 2026-04-20 00:38:54.470512 | orchestrator | TASK [Apply netplan 
configuration] ********************************************* 2026-04-20 00:38:54.470524 | orchestrator | Monday 20 April 2026 00:38:37 +0000 (0:00:02.373) 0:00:03.192 ********** 2026-04-20 00:38:54.470537 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:38:54.470549 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:38:54.470563 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:38:54.470576 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:38:54.470587 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:38:54.470598 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:38:54.470608 | orchestrator | 2026-04-20 00:38:54.470619 | orchestrator | PLAY [Add custom CA certificates to non-manager nodes] ************************* 2026-04-20 00:38:54.470630 | orchestrator | 2026-04-20 00:38:54.470642 | orchestrator | TASK [Copy custom CA certificates] ********************************************* 2026-04-20 00:38:54.470652 | orchestrator | Monday 20 April 2026 00:38:39 +0000 (0:00:02.172) 0:00:05.364 ********** 2026-04-20 00:38:54.470664 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2026-04-20 00:38:54.470677 | orchestrator | changed: [testbed-node-4] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2026-04-20 00:38:54.470688 | orchestrator | changed: [testbed-node-0] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2026-04-20 00:38:54.470698 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2026-04-20 00:38:54.470709 | orchestrator | changed: [testbed-node-3] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2026-04-20 00:38:54.470720 | orchestrator | changed: [testbed-node-5] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2026-04-20 00:38:54.470757 | orchestrator | 2026-04-20 00:38:54.470769 | orchestrator | TASK [Run 
update-ca-certificates] ********************************************** 2026-04-20 00:38:54.470780 | orchestrator | Monday 20 April 2026 00:38:40 +0000 (0:00:01.225) 0:00:06.590 ********** 2026-04-20 00:38:54.470791 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:38:54.470802 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:38:54.470812 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:38:54.470823 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:38:54.470833 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:38:54.470844 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:38:54.470854 | orchestrator | 2026-04-20 00:38:54.470865 | orchestrator | TASK [Run update-ca-trust] ***************************************************** 2026-04-20 00:38:54.470876 | orchestrator | Monday 20 April 2026 00:38:44 +0000 (0:00:03.801) 0:00:10.391 ********** 2026-04-20 00:38:54.470886 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:38:54.470897 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:38:54.470907 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:38:54.470918 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:38:54.470943 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:38:54.470954 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:38:54.470964 | orchestrator | 2026-04-20 00:38:54.470975 | orchestrator | PLAY [Add a workaround service] ************************************************ 2026-04-20 00:38:54.470986 | orchestrator | 2026-04-20 00:38:54.470997 | orchestrator | TASK [Copy workarounds.sh scripts] ********************************************* 2026-04-20 00:38:54.471008 | orchestrator | Monday 20 April 2026 00:38:45 +0000 (0:00:00.479) 0:00:10.870 ********** 2026-04-20 00:38:54.471019 | orchestrator | changed: [testbed-manager] 2026-04-20 00:38:54.471029 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:38:54.471040 | orchestrator | changed: [testbed-node-1] 2026-04-20 
00:38:54.471050 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:38:54.471061 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:38:54.471071 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:38:54.471081 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:38:54.471092 | orchestrator | 2026-04-20 00:38:54.471102 | orchestrator | TASK [Copy workarounds systemd unit file] ************************************** 2026-04-20 00:38:54.471113 | orchestrator | Monday 20 April 2026 00:38:46 +0000 (0:00:01.553) 0:00:12.424 ********** 2026-04-20 00:38:54.471177 | orchestrator | changed: [testbed-manager] 2026-04-20 00:38:54.471189 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:38:54.471200 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:38:54.471210 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:38:54.471221 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:38:54.471232 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:38:54.471268 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:38:54.471279 | orchestrator | 2026-04-20 00:38:54.471290 | orchestrator | TASK [Reload systemd daemon] *************************************************** 2026-04-20 00:38:54.471301 | orchestrator | Monday 20 April 2026 00:38:48 +0000 (0:00:01.326) 0:00:13.750 ********** 2026-04-20 00:38:54.471311 | orchestrator | ok: [testbed-manager] 2026-04-20 00:38:54.471322 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:38:54.471332 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:38:54.471343 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:38:54.471354 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:38:54.471364 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:38:54.471375 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:38:54.471385 | orchestrator | 2026-04-20 00:38:54.471396 | orchestrator | TASK [Enable workarounds.service (Debian)] ************************************* 2026-04-20 00:38:54.471407 | orchestrator 
| Monday 20 April 2026 00:38:49 +0000 (0:00:01.531) 0:00:15.282 ********** 2026-04-20 00:38:54.471417 | orchestrator | changed: [testbed-manager] 2026-04-20 00:38:54.471428 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:38:54.471438 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:38:54.471459 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:38:54.471469 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:38:54.471480 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:38:54.471491 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:38:54.471501 | orchestrator | 2026-04-20 00:38:54.471512 | orchestrator | TASK [Enable and start workarounds.service (RedHat)] *************************** 2026-04-20 00:38:54.471523 | orchestrator | Monday 20 April 2026 00:38:51 +0000 (0:00:01.511) 0:00:16.794 ********** 2026-04-20 00:38:54.471533 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:38:54.471544 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:38:54.471555 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:38:54.471565 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:38:54.471576 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:38:54.471586 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:38:54.471597 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:38:54.471607 | orchestrator | 2026-04-20 00:38:54.471618 | orchestrator | PLAY [On Ubuntu 24.04 install python3-docker from Debian Sid] ****************** 2026-04-20 00:38:54.471629 | orchestrator | 2026-04-20 00:38:54.471639 | orchestrator | TASK [Install python3-docker] ************************************************** 2026-04-20 00:38:54.471650 | orchestrator | Monday 20 April 2026 00:38:51 +0000 (0:00:00.650) 0:00:17.444 ********** 2026-04-20 00:38:54.471661 | orchestrator | ok: [testbed-manager] 2026-04-20 00:38:54.471672 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:38:54.471682 | orchestrator | ok: 
[testbed-node-0] 2026-04-20 00:38:54.471693 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:38:54.471704 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:38:54.471714 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:38:54.471725 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:38:54.471736 | orchestrator | 2026-04-20 00:38:54.471746 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:38:54.471759 | orchestrator | testbed-manager : ok=7  changed=4  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-04-20 00:38:54.471771 | orchestrator | testbed-node-0 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-20 00:38:54.471782 | orchestrator | testbed-node-1 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-20 00:38:54.471793 | orchestrator | testbed-node-2 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-20 00:38:54.471804 | orchestrator | testbed-node-3 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-20 00:38:54.471814 | orchestrator | testbed-node-4 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-20 00:38:54.471825 | orchestrator | testbed-node-5 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-20 00:38:54.471836 | orchestrator | 2026-04-20 00:38:54.471846 | orchestrator | 2026-04-20 00:38:54.471857 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 00:38:54.471868 | orchestrator | Monday 20 April 2026 00:38:54 +0000 (0:00:02.740) 0:00:20.184 ********** 2026-04-20 00:38:54.471887 | orchestrator | =============================================================================== 2026-04-20 00:38:54.471898 | orchestrator | Run update-ca-certificates ---------------------------------------------- 3.80s 2026-04-20 00:38:54.471909 | orchestrator | 
Install python3-docker -------------------------------------------------- 2.74s 2026-04-20 00:38:54.471919 | orchestrator | Apply netplan configuration --------------------------------------------- 2.37s 2026-04-20 00:38:54.471930 | orchestrator | Apply netplan configuration --------------------------------------------- 2.17s 2026-04-20 00:38:54.471948 | orchestrator | Copy workarounds.sh scripts --------------------------------------------- 1.55s 2026-04-20 00:38:54.471959 | orchestrator | Reload systemd daemon --------------------------------------------------- 1.53s 2026-04-20 00:38:54.471970 | orchestrator | Enable workarounds.service (Debian) ------------------------------------- 1.51s 2026-04-20 00:38:54.471980 | orchestrator | Copy workarounds systemd unit file -------------------------------------- 1.33s 2026-04-20 00:38:54.471991 | orchestrator | Copy custom CA certificates --------------------------------------------- 1.23s 2026-04-20 00:38:54.472002 | orchestrator | Enable and start workarounds.service (RedHat) --------------------------- 0.65s 2026-04-20 00:38:54.472012 | orchestrator | Group hosts based on virtualization_role -------------------------------- 0.64s 2026-04-20 00:38:54.472030 | orchestrator | Run update-ca-trust ----------------------------------------------------- 0.48s 2026-04-20 00:38:54.764645 | orchestrator | + osism apply reboot -l testbed-nodes -e ireallymeanit=yes 2026-04-20 00:39:06.053788 | orchestrator | 2026-04-20 00:39:06 | INFO  | Prepare task for execution of reboot. 2026-04-20 00:39:06.119898 | orchestrator | 2026-04-20 00:39:06 | INFO  | Task 490ba4ca-5b72-485e-a088-6fec5f2b2b8d (reboot) was prepared for execution. 2026-04-20 00:39:06.121533 | orchestrator | 2026-04-20 00:39:06 | INFO  | It takes a moment until task 490ba4ca-5b72-485e-a088-6fec5f2b2b8d (reboot) has been started and output is visible here. 
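The `-e ireallymeanit=yes` flag passed to `osism apply reboot` above feeds a safety guard: the play's first task ("Exit playbook, if user did not mean to reboot systems") aborts unless the variable is set, which is why it shows as `skipping` in the output that follows. A minimal shell sketch of that kind of guard, assuming only the variable name from the command line (the actual check lives in the Ansible playbook, not in shell):

```shell
# Confirmation guard sketch: refuse a destructive action unless the
# caller explicitly opted in. The variable name `ireallymeanit` is taken
# from the `osism apply reboot` invocation above; the logic is assumed.
require_confirmation() {
    if [[ "${ireallymeanit:-}" != "yes" ]]; then
        echo "Refusing to reboot: pass ireallymeanit=yes to confirm" >&2
        return 1
    fi
}
```

In the real playbook the equivalent task runs `meta: end_play` style logic per host, which is why each node reports one `skipping` entry before the reboot task fires.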
2026-04-20 00:39:16.546575 | orchestrator | 2026-04-20 00:39:16.546716 | orchestrator | PLAY [Reboot systems] ********************************************************** 2026-04-20 00:39:16.546732 | orchestrator | 2026-04-20 00:39:16.546743 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2026-04-20 00:39:16.546755 | orchestrator | Monday 20 April 2026 00:39:08 +0000 (0:00:00.204) 0:00:00.204 ********** 2026-04-20 00:39:16.546765 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:39:16.546777 | orchestrator | 2026-04-20 00:39:16.546788 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2026-04-20 00:39:16.546798 | orchestrator | Monday 20 April 2026 00:39:09 +0000 (0:00:00.124) 0:00:00.328 ********** 2026-04-20 00:39:16.546809 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:39:16.546820 | orchestrator | 2026-04-20 00:39:16.546831 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2026-04-20 00:39:16.546841 | orchestrator | Monday 20 April 2026 00:39:10 +0000 (0:00:01.208) 0:00:01.537 ********** 2026-04-20 00:39:16.546852 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:39:16.546863 | orchestrator | 2026-04-20 00:39:16.546873 | orchestrator | PLAY [Reboot systems] ********************************************************** 2026-04-20 00:39:16.546883 | orchestrator | 2026-04-20 00:39:16.546894 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2026-04-20 00:39:16.546905 | orchestrator | Monday 20 April 2026 00:39:10 +0000 (0:00:00.093) 0:00:01.630 ********** 2026-04-20 00:39:16.546915 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:39:16.546925 | orchestrator | 2026-04-20 00:39:16.546936 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2026-04-20 00:39:16.546946 | orchestrator | Monday 20 April 2026 
00:39:10 +0000 (0:00:00.095) 0:00:01.725 ********** 2026-04-20 00:39:16.546957 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:39:16.546967 | orchestrator | 2026-04-20 00:39:16.546978 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2026-04-20 00:39:16.546988 | orchestrator | Monday 20 April 2026 00:39:11 +0000 (0:00:00.996) 0:00:02.722 ********** 2026-04-20 00:39:16.546999 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:39:16.547009 | orchestrator | 2026-04-20 00:39:16.547020 | orchestrator | PLAY [Reboot systems] ********************************************************** 2026-04-20 00:39:16.547031 | orchestrator | 2026-04-20 00:39:16.547041 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2026-04-20 00:39:16.547052 | orchestrator | Monday 20 April 2026 00:39:11 +0000 (0:00:00.106) 0:00:02.828 ********** 2026-04-20 00:39:16.547065 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:39:16.547109 | orchestrator | 2026-04-20 00:39:16.547121 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2026-04-20 00:39:16.547133 | orchestrator | Monday 20 April 2026 00:39:11 +0000 (0:00:00.083) 0:00:02.912 ********** 2026-04-20 00:39:16.547145 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:39:16.547157 | orchestrator | 2026-04-20 00:39:16.547193 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2026-04-20 00:39:16.547206 | orchestrator | Monday 20 April 2026 00:39:12 +0000 (0:00:01.027) 0:00:03.939 ********** 2026-04-20 00:39:16.547218 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:39:16.547230 | orchestrator | 2026-04-20 00:39:16.547242 | orchestrator | PLAY [Reboot systems] ********************************************************** 2026-04-20 00:39:16.547254 | orchestrator | 2026-04-20 00:39:16.547266 | orchestrator | TASK [Exit playbook, if 
user did not mean to reboot systems] ******************* 2026-04-20 00:39:16.547278 | orchestrator | Monday 20 April 2026 00:39:12 +0000 (0:00:00.093) 0:00:04.032 ********** 2026-04-20 00:39:16.547291 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:39:16.547303 | orchestrator | 2026-04-20 00:39:16.547315 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2026-04-20 00:39:16.547325 | orchestrator | Monday 20 April 2026 00:39:12 +0000 (0:00:00.086) 0:00:04.119 ********** 2026-04-20 00:39:16.547336 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:39:16.547347 | orchestrator | 2026-04-20 00:39:16.547374 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2026-04-20 00:39:16.547385 | orchestrator | Monday 20 April 2026 00:39:13 +0000 (0:00:00.987) 0:00:05.107 ********** 2026-04-20 00:39:16.547396 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:39:16.547407 | orchestrator | 2026-04-20 00:39:16.547417 | orchestrator | PLAY [Reboot systems] ********************************************************** 2026-04-20 00:39:16.547428 | orchestrator | 2026-04-20 00:39:16.547439 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2026-04-20 00:39:16.547450 | orchestrator | Monday 20 April 2026 00:39:13 +0000 (0:00:00.110) 0:00:05.217 ********** 2026-04-20 00:39:16.547461 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:39:16.547471 | orchestrator | 2026-04-20 00:39:16.547482 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2026-04-20 00:39:16.547493 | orchestrator | Monday 20 April 2026 00:39:14 +0000 (0:00:00.201) 0:00:05.418 ********** 2026-04-20 00:39:16.547504 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:39:16.547515 | orchestrator | 2026-04-20 00:39:16.547525 | orchestrator | TASK [Reboot system - wait for the reboot to complete] 
************************* 2026-04-20 00:39:16.547536 | orchestrator | Monday 20 April 2026 00:39:15 +0000 (0:00:01.003) 0:00:06.422 ********** 2026-04-20 00:39:16.547547 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:39:16.547558 | orchestrator | 2026-04-20 00:39:16.547569 | orchestrator | PLAY [Reboot systems] ********************************************************** 2026-04-20 00:39:16.547579 | orchestrator | 2026-04-20 00:39:16.547590 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2026-04-20 00:39:16.547601 | orchestrator | Monday 20 April 2026 00:39:15 +0000 (0:00:00.094) 0:00:06.517 ********** 2026-04-20 00:39:16.547612 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:39:16.547623 | orchestrator | 2026-04-20 00:39:16.547634 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2026-04-20 00:39:16.547645 | orchestrator | Monday 20 April 2026 00:39:15 +0000 (0:00:00.095) 0:00:06.613 ********** 2026-04-20 00:39:16.547656 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:39:16.547667 | orchestrator | 2026-04-20 00:39:16.547678 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2026-04-20 00:39:16.547689 | orchestrator | Monday 20 April 2026 00:39:16 +0000 (0:00:00.970) 0:00:07.583 ********** 2026-04-20 00:39:16.547718 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:39:16.547730 | orchestrator | 2026-04-20 00:39:16.547741 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:39:16.547753 | orchestrator | testbed-node-0 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-20 00:39:16.547773 | orchestrator | testbed-node-1 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-20 00:39:16.547785 | orchestrator | testbed-node-2 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  
rescued=0 ignored=0 2026-04-20 00:39:16.547796 | orchestrator | testbed-node-3 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-20 00:39:16.547806 | orchestrator | testbed-node-4 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-20 00:39:16.547817 | orchestrator | testbed-node-5 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-20 00:39:16.547828 | orchestrator | 2026-04-20 00:39:16.547839 | orchestrator | 2026-04-20 00:39:16.547849 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 00:39:16.547860 | orchestrator | Monday 20 April 2026 00:39:16 +0000 (0:00:00.035) 0:00:07.619 ********** 2026-04-20 00:39:16.547871 | orchestrator | =============================================================================== 2026-04-20 00:39:16.547881 | orchestrator | Reboot system - do not wait for the reboot to complete ------------------ 6.19s 2026-04-20 00:39:16.547892 | orchestrator | Exit playbook, if user did not mean to reboot systems ------------------- 0.69s 2026-04-20 00:39:16.547903 | orchestrator | Reboot system - wait for the reboot to complete ------------------------- 0.53s 2026-04-20 00:39:16.659463 | orchestrator | + osism apply wait-for-connection -l testbed-nodes -e ireallymeanit=yes 2026-04-20 00:39:27.780321 | orchestrator | 2026-04-20 00:39:27 | INFO  | Prepare task for execution of wait-for-connection. 2026-04-20 00:39:27.850632 | orchestrator | 2026-04-20 00:39:27 | INFO  | Task 269b2dcc-3bd8-4455-acb7-8f34a3cd7895 (wait-for-connection) was prepared for execution. 2026-04-20 00:39:27.850729 | orchestrator | 2026-04-20 00:39:27 | INFO  | It takes a moment until task 269b2dcc-3bd8-4455-acb7-8f34a3cd7895 (wait-for-connection) has been started and output is visible here. 
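Note the two-step pattern here: the reboot play deliberately does not block ("Reboot system - do not wait for the reboot to complete"), and the separate `wait-for-connection` task that follows polls every node until it is reachable again. A rough shell equivalent of that reachability poll, under the assumption that plain SSH connectivity is the readiness signal (host names and timeout below are illustrative, not taken from the job):

```shell
# Poll until a host answers on SSH again after an async reboot.
# Timeout and interval are illustrative; the job uses Ansible's
# wait_for_connection module rather than this loop.
wait_for_ssh() {
    local host=$1
    local deadline=$((SECONDS + ${2:-300}))
    until ssh -o BatchMode=yes -o ConnectTimeout=5 "$host" true 2>/dev/null; do
        (( SECONDS >= deadline )) && return 1
        sleep 5
    done
}
```

Splitting "trigger reboot" from "wait for return" lets all six nodes reboot in parallel; the single wait task then takes only as long as the slowest node (11.52s in the recap below).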
2026-04-20 00:39:42.632573 | orchestrator | 2026-04-20 00:39:42.632690 | orchestrator | PLAY [Wait until remote systems are reachable] ********************************* 2026-04-20 00:39:42.632705 | orchestrator | 2026-04-20 00:39:42.632717 | orchestrator | TASK [Wait until remote system is reachable] *********************************** 2026-04-20 00:39:42.632729 | orchestrator | Monday 20 April 2026 00:39:30 +0000 (0:00:00.322) 0:00:00.322 ********** 2026-04-20 00:39:42.632741 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:39:42.632753 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:39:42.632764 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:39:42.632774 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:39:42.632785 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:39:42.632796 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:39:42.632806 | orchestrator | 2026-04-20 00:39:42.632817 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:39:42.632829 | orchestrator | testbed-node-0 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-20 00:39:42.632842 | orchestrator | testbed-node-1 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-20 00:39:42.632852 | orchestrator | testbed-node-2 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-20 00:39:42.632864 | orchestrator | testbed-node-3 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-20 00:39:42.632874 | orchestrator | testbed-node-4 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-20 00:39:42.632916 | orchestrator | testbed-node-5 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-20 00:39:42.632927 | orchestrator | 2026-04-20 00:39:42.632938 | orchestrator | 2026-04-20 00:39:42.632949 | orchestrator | TASKS RECAP 
******************************************************************** 2026-04-20 00:39:42.632960 | orchestrator | Monday 20 April 2026 00:39:42 +0000 (0:00:11.522) 0:00:11.845 ********** 2026-04-20 00:39:42.632971 | orchestrator | =============================================================================== 2026-04-20 00:39:42.632982 | orchestrator | Wait until remote system is reachable ---------------------------------- 11.52s 2026-04-20 00:39:42.753487 | orchestrator | + osism apply hddtemp 2026-04-20 00:39:53.907178 | orchestrator | 2026-04-20 00:39:53 | INFO  | Prepare task for execution of hddtemp. 2026-04-20 00:39:53.977803 | orchestrator | 2026-04-20 00:39:53 | INFO  | Task f3fc8405-1b5b-46a0-b8b5-bec3ab67b4a9 (hddtemp) was prepared for execution. 2026-04-20 00:39:53.977904 | orchestrator | 2026-04-20 00:39:53 | INFO  | It takes a moment until task f3fc8405-1b5b-46a0-b8b5-bec3ab67b4a9 (hddtemp) has been started and output is visible here. 2026-04-20 00:40:20.929436 | orchestrator | 2026-04-20 00:40:20.929529 | orchestrator | PLAY [Apply role hddtemp] ****************************************************** 2026-04-20 00:40:20.929540 | orchestrator | 2026-04-20 00:40:20.929547 | orchestrator | TASK [osism.services.hddtemp : Gather variables for each operating system] ***** 2026-04-20 00:40:20.929554 | orchestrator | Monday 20 April 2026 00:39:57 +0000 (0:00:00.330) 0:00:00.330 ********** 2026-04-20 00:40:20.929562 | orchestrator | ok: [testbed-manager] 2026-04-20 00:40:20.929570 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:40:20.929577 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:40:20.929583 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:40:20.929590 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:40:20.929597 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:40:20.929603 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:40:20.929610 | orchestrator | 2026-04-20 00:40:20.929617 | orchestrator | TASK [osism.services.hddtemp : Include 
distribution specific install tasks] **** 2026-04-20 00:40:20.929623 | orchestrator | Monday 20 April 2026 00:39:58 +0000 (0:00:00.629) 0:00:00.959 ********** 2026-04-20 00:40:20.929632 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/hddtemp/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:40:20.929641 | orchestrator | 2026-04-20 00:40:20.929648 | orchestrator | TASK [osism.services.hddtemp : Remove hddtemp package] ************************* 2026-04-20 00:40:20.929655 | orchestrator | Monday 20 April 2026 00:39:59 +0000 (0:00:01.061) 0:00:02.020 ********** 2026-04-20 00:40:20.929661 | orchestrator | ok: [testbed-manager] 2026-04-20 00:40:20.929668 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:40:20.929674 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:40:20.929681 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:40:20.929687 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:40:20.929694 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:40:20.929716 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:40:20.929722 | orchestrator | 2026-04-20 00:40:20.929729 | orchestrator | TASK [osism.services.hddtemp : Enable Kernel Module drivetemp] ***************** 2026-04-20 00:40:20.929736 | orchestrator | Monday 20 April 2026 00:40:01 +0000 (0:00:02.500) 0:00:04.521 ********** 2026-04-20 00:40:20.929742 | orchestrator | changed: [testbed-manager] 2026-04-20 00:40:20.929751 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:40:20.929758 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:40:20.929764 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:40:20.929771 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:40:20.929777 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:40:20.929784 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:40:20.929790 | 
orchestrator | 2026-04-20 00:40:20.929814 | orchestrator | TASK [osism.services.hddtemp : Check if drivetemp module is available] ********* 2026-04-20 00:40:20.929821 | orchestrator | Monday 20 April 2026 00:40:02 +0000 (0:00:00.890) 0:00:05.411 ********** 2026-04-20 00:40:20.929828 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:40:20.929835 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:40:20.929841 | orchestrator | ok: [testbed-manager] 2026-04-20 00:40:20.929848 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:40:20.929854 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:40:20.929860 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:40:20.929867 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:40:20.929873 | orchestrator | 2026-04-20 00:40:20.929880 | orchestrator | TASK [osism.services.hddtemp : Load Kernel Module drivetemp] ******************* 2026-04-20 00:40:20.929886 | orchestrator | Monday 20 April 2026 00:40:03 +0000 (0:00:01.239) 0:00:06.650 ********** 2026-04-20 00:40:20.929893 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:40:20.929900 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:40:20.929906 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:40:20.929912 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:40:20.929919 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:40:20.929925 | orchestrator | changed: [testbed-manager] 2026-04-20 00:40:20.929935 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:40:20.929942 | orchestrator | 2026-04-20 00:40:20.929949 | orchestrator | TASK [osism.services.hddtemp : Install lm-sensors] ***************************** 2026-04-20 00:40:20.929955 | orchestrator | Monday 20 April 2026 00:40:04 +0000 (0:00:00.590) 0:00:07.241 ********** 2026-04-20 00:40:20.929962 | orchestrator | changed: [testbed-manager] 2026-04-20 00:40:20.929970 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:40:20.929977 | orchestrator | changed: [testbed-node-1] 
2026-04-20 00:40:20.929984 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:40:20.929991 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:40:20.929999 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:40:20.930006 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:40:20.930053 | orchestrator | 2026-04-20 00:40:20.930061 | orchestrator | TASK [osism.services.hddtemp : Include distribution specific service tasks] **** 2026-04-20 00:40:20.930069 | orchestrator | Monday 20 April 2026 00:40:18 +0000 (0:00:13.731) 0:00:20.973 ********** 2026-04-20 00:40:20.930076 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/hddtemp/tasks/service-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:40:20.930084 | orchestrator | 2026-04-20 00:40:20.930092 | orchestrator | TASK [osism.services.hddtemp : Manage lm-sensors service] ********************** 2026-04-20 00:40:20.930099 | orchestrator | Monday 20 April 2026 00:40:18 +0000 (0:00:00.933) 0:00:21.906 ********** 2026-04-20 00:40:20.930106 | orchestrator | changed: [testbed-manager] 2026-04-20 00:40:20.930114 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:40:20.930121 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:40:20.930129 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:40:20.930136 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:40:20.930143 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:40:20.930150 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:40:20.930158 | orchestrator | 2026-04-20 00:40:20.930165 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:40:20.930174 | orchestrator | testbed-manager : ok=9  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-20 00:40:20.930194 | orchestrator | testbed-node-0 : ok=8  
changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-04-20 00:40:20.930203 | orchestrator | testbed-node-1 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-04-20 00:40:20.930210 | orchestrator | testbed-node-2 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-04-20 00:40:20.930224 | orchestrator | testbed-node-3 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-04-20 00:40:20.930232 | orchestrator | testbed-node-4 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-04-20 00:40:20.930240 | orchestrator | testbed-node-5 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2026-04-20 00:40:20.930247 | orchestrator | 2026-04-20 00:40:20.930255 | orchestrator | 2026-04-20 00:40:20.930262 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 00:40:20.930270 | orchestrator | Monday 20 April 2026 00:40:20 +0000 (0:00:01.722) 0:00:23.628 ********** 2026-04-20 00:40:20.930277 | orchestrator | =============================================================================== 2026-04-20 00:40:20.930284 | orchestrator | osism.services.hddtemp : Install lm-sensors ---------------------------- 13.73s 2026-04-20 00:40:20.930292 | orchestrator | osism.services.hddtemp : Remove hddtemp package ------------------------- 2.50s 2026-04-20 00:40:20.930299 | orchestrator | osism.services.hddtemp : Manage lm-sensors service ---------------------- 1.72s 2026-04-20 00:40:20.930307 | orchestrator | osism.services.hddtemp : Check if drivetemp module is available --------- 1.24s 2026-04-20 00:40:20.930336 | orchestrator | osism.services.hddtemp : Include distribution specific install tasks ---- 1.06s 2026-04-20 00:40:20.930343 | orchestrator | osism.services.hddtemp : Include distribution specific service tasks ---- 0.93s 2026-04-20 00:40:20.930350 | orchestrator | osism.services.hddtemp : Enable 
Kernel Module drivetemp ----------------- 0.89s 2026-04-20 00:40:20.930356 | orchestrator | osism.services.hddtemp : Gather variables for each operating system ----- 0.63s 2026-04-20 00:40:20.930363 | orchestrator | osism.services.hddtemp : Load Kernel Module drivetemp ------------------- 0.59s 2026-04-20 00:40:21.053380 | orchestrator | ++ semver 10.0.0 7.1.1 2026-04-20 00:40:21.089600 | orchestrator | + [[ 1 -ge 0 ]] 2026-04-20 00:40:21.089666 | orchestrator | + sudo systemctl restart manager.service 2026-04-20 00:40:38.000031 | orchestrator | + [[ ceph-ansible == \c\e\p\h\-\a\n\s\i\b\l\e ]] 2026-04-20 00:40:38.000105 | orchestrator | + wait_for_container_healthy 60 ceph-ansible 2026-04-20 00:40:38.000112 | orchestrator | + local max_attempts=60 2026-04-20 00:40:38.000117 | orchestrator | + local name=ceph-ansible 2026-04-20 00:40:38.000121 | orchestrator | + local attempt_num=1 2026-04-20 00:40:38.000125 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-20 00:40:38.037514 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-04-20 00:40:38.037596 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-20 00:40:38.037607 | orchestrator | + sleep 5 2026-04-20 00:40:43.042791 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-20 00:40:43.087906 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-04-20 00:40:43.088025 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-20 00:40:43.088050 | orchestrator | + sleep 5 2026-04-20 00:40:48.088982 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-20 00:40:48.129884 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-04-20 00:40:48.129978 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-20 00:40:48.129993 | orchestrator | + sleep 5 2026-04-20 00:40:53.134672 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 
2026-04-20 00:40:53.167064 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-04-20 00:40:53.167169 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-20 00:40:53.167185 | orchestrator | + sleep 5 2026-04-20 00:40:58.170867 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-20 00:40:58.202228 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-04-20 00:40:58.202323 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-20 00:40:58.202337 | orchestrator | + sleep 5 2026-04-20 00:41:03.206223 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-20 00:41:03.254947 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-04-20 00:41:03.255043 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-20 00:41:03.255085 | orchestrator | + sleep 5 2026-04-20 00:41:08.259349 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-20 00:41:08.291858 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2026-04-20 00:41:08.291960 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-20 00:41:08.291973 | orchestrator | + sleep 5 2026-04-20 00:41:13.295181 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-20 00:41:13.324924 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2026-04-20 00:41:13.325002 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-20 00:41:13.325012 | orchestrator | + sleep 5 2026-04-20 00:41:18.328701 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-20 00:41:18.367763 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2026-04-20 00:41:18.367940 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-20 00:41:18.367967 | orchestrator | + sleep 5 2026-04-20 00:41:23.372952 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-20 
00:41:23.415496 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2026-04-20 00:41:23.415631 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-20 00:41:23.415651 | orchestrator | + sleep 5 2026-04-20 00:41:28.420670 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-20 00:41:28.460805 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2026-04-20 00:41:28.460903 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-20 00:41:28.460915 | orchestrator | + sleep 5 2026-04-20 00:41:33.465273 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-20 00:41:33.501483 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2026-04-20 00:41:33.501591 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-20 00:41:33.501611 | orchestrator | + sleep 5 2026-04-20 00:41:38.505146 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-20 00:41:38.532424 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2026-04-20 00:41:38.532519 | orchestrator | + (( attempt_num++ == max_attempts )) 2026-04-20 00:41:38.532531 | orchestrator | + sleep 5 2026-04-20 00:41:43.536273 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2026-04-20 00:41:43.571959 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2026-04-20 00:41:43.572039 | orchestrator | + wait_for_container_healthy 60 kolla-ansible 2026-04-20 00:41:43.572048 | orchestrator | + local max_attempts=60 2026-04-20 00:41:43.572055 | orchestrator | + local name=kolla-ansible 2026-04-20 00:41:43.572061 | orchestrator | + local attempt_num=1 2026-04-20 00:41:43.572995 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' kolla-ansible 2026-04-20 00:41:43.609727 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2026-04-20 00:41:43.609816 | orchestrator | + wait_for_container_healthy 60 osism-ansible 2026-04-20 00:41:43.609827 | 
orchestrator | + local max_attempts=60 2026-04-20 00:41:43.609836 | orchestrator | + local name=osism-ansible 2026-04-20 00:41:43.609844 | orchestrator | + local attempt_num=1 2026-04-20 00:41:43.610083 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' osism-ansible 2026-04-20 00:41:43.638824 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2026-04-20 00:41:43.638920 | orchestrator | + [[ true == \t\r\u\e ]] 2026-04-20 00:41:43.638933 | orchestrator | + sh -c /opt/configuration/scripts/disable-ara.sh 2026-04-20 00:41:43.802107 | orchestrator | ARA in ceph-ansible already disabled. 2026-04-20 00:41:43.936164 | orchestrator | ARA in kolla-ansible already disabled. 2026-04-20 00:41:44.082527 | orchestrator | ARA in osism-ansible already disabled. 2026-04-20 00:41:44.200423 | orchestrator | ARA in osism-kubernetes already disabled. 2026-04-20 00:41:44.200965 | orchestrator | + osism apply gather-facts 2026-04-20 00:41:55.408186 | orchestrator | 2026-04-20 00:41:55 | INFO  | Prepare task for execution of gather-facts. 2026-04-20 00:41:55.481053 | orchestrator | 2026-04-20 00:41:55 | INFO  | Task 5ce032be-0aa0-4879-81ae-b969c3807091 (gather-facts) was prepared for execution. 2026-04-20 00:41:55.481145 | orchestrator | 2026-04-20 00:41:55 | INFO  | It takes a moment until task 5ce032be-0aa0-4879-81ae-b969c3807091 (gather-facts) has been started and output is visible here. 
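The `set -x` trace above shows `wait_for_container_healthy` polling `docker inspect -f '{{.State.Health.Status}}'` every 5 seconds until the container reports `healthy` (ceph-ansible moves through `unhealthy` and `starting` before succeeding; kolla-ansible and osism-ansible are healthy on the first check). Reconstructed from that trace, the helper looks roughly like this — a sketch, since only the traced commands are visible, and `docker` is used unqualified here where the trace shows `/usr/bin/docker`:

```shell
# Reconstructed from the set -x trace: wait up to max_attempts * 5s
# for a container's Docker healthcheck to report "healthy".
wait_for_container_healthy() {
    local max_attempts=$1
    local name=$2
    local attempt_num=1
    until [[ "$(docker inspect -f '{{.State.Health.Status}}' "$name")" == "healthy" ]]; do
        if (( attempt_num++ == max_attempts )); then
            echo "$name still not healthy after $max_attempts attempts" >&2
            return 1
        fi
        sleep 5
    done
}
```

With `max_attempts=60` this allows up to about five minutes per container, which comfortably covers the ~65 seconds ceph-ansible needed after the `manager.service` restart.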
2026-04-20 00:42:06.741564 | orchestrator |
2026-04-20 00:42:06.741682 | orchestrator | PLAY [Gather facts for all hosts] **********************************************
2026-04-20 00:42:06.741699 | orchestrator |
2026-04-20 00:42:06.741712 | orchestrator | TASK [Gathers facts about hosts] ***********************************************
2026-04-20 00:42:06.741751 | orchestrator | Monday 20 April 2026 00:41:58 +0000 (0:00:00.216) 0:00:00.216 **********
2026-04-20 00:42:06.741763 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:42:06.741775 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:42:06.741786 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:42:06.741798 | orchestrator | ok: [testbed-manager]
2026-04-20 00:42:06.741901 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:42:06.741924 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:42:06.741940 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:42:06.741950 | orchestrator |
2026-04-20 00:42:06.741961 | orchestrator | PLAY [Gather facts for all hosts if using --limit] *****************************
2026-04-20 00:42:06.741972 | orchestrator |
2026-04-20 00:42:06.741982 | orchestrator | TASK [Gather facts for all hosts] **********************************************
2026-04-20 00:42:06.741993 | orchestrator | Monday 20 April 2026 00:42:06 +0000 (0:00:07.831) 0:00:08.048 **********
2026-04-20 00:42:06.742004 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:42:06.742068 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:42:06.742082 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:42:06.742093 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:42:06.742107 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:42:06.742119 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:42:06.742132 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:42:06.742181 | orchestrator |
2026-04-20 00:42:06.742194 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:42:06.742207 | orchestrator | testbed-manager : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0
2026-04-20 00:42:06.742235 | orchestrator | testbed-node-0 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0
2026-04-20 00:42:06.742249 | orchestrator | testbed-node-1 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0
2026-04-20 00:42:06.742262 | orchestrator | testbed-node-2 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0
2026-04-20 00:42:06.742275 | orchestrator | testbed-node-3 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0
2026-04-20 00:42:06.742287 | orchestrator | testbed-node-4 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0
2026-04-20 00:42:06.742300 | orchestrator | testbed-node-5 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0
2026-04-20 00:42:06.742312 | orchestrator |
2026-04-20 00:42:06.742325 | orchestrator |
2026-04-20 00:42:06.742338 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:42:06.742350 | orchestrator | Monday 20 April 2026 00:42:06 +0000 (0:00:00.525) 0:00:08.574 **********
2026-04-20 00:42:06.742363 | orchestrator | ===============================================================================
2026-04-20 00:42:06.742376 | orchestrator | Gathers facts about hosts ----------------------------------------------- 7.83s
2026-04-20 00:42:06.742388 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.53s
2026-04-20 00:42:06.862618 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/001-helpers.sh /usr/local/bin/deploy-helper
2026-04-20 00:42:06.872865 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/100-ceph-with-ansible.sh /usr/local/bin/deploy-ceph-with-ansible
2026-04-20 00:42:06.882499 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/100-ceph-with-rook.sh /usr/local/bin/deploy-ceph-with-rook
2026-04-20 00:42:06.891983 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/200-infrastructure.sh /usr/local/bin/deploy-infrastructure
2026-04-20 00:42:06.901589 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/300-openstack.sh /usr/local/bin/deploy-openstack
2026-04-20 00:42:06.911125 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/320-openstack-minimal.sh /usr/local/bin/deploy-openstack-minimal
2026-04-20 00:42:06.918873 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/400-monitoring.sh /usr/local/bin/deploy-monitoring
2026-04-20 00:42:06.928104 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/500-kubernetes.sh /usr/local/bin/deploy-kubernetes
2026-04-20 00:42:06.937456 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/510-clusterapi.sh /usr/local/bin/deploy-kubernetes-clusterapi
2026-04-20 00:42:06.946695 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade-manager.sh /usr/local/bin/upgrade-manager
2026-04-20 00:42:06.956476 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/100-ceph-with-ansible.sh /usr/local/bin/upgrade-ceph-with-ansible
2026-04-20 00:42:06.965908 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/100-ceph-with-rook.sh /usr/local/bin/upgrade-ceph-with-rook
2026-04-20 00:42:06.977037 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/200-infrastructure.sh /usr/local/bin/upgrade-infrastructure
2026-04-20 00:42:06.986193 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/300-openstack.sh /usr/local/bin/upgrade-openstack
2026-04-20 00:42:06.996110 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/320-openstack-minimal.sh /usr/local/bin/upgrade-openstack-minimal
2026-04-20 00:42:07.005738 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/400-monitoring.sh /usr/local/bin/upgrade-monitoring
2026-04-20 00:42:07.015353 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/500-kubernetes.sh /usr/local/bin/upgrade-kubernetes
2026-04-20 00:42:07.024737 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/510-clusterapi.sh /usr/local/bin/upgrade-kubernetes-clusterapi
2026-04-20 00:42:07.035843 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/300-openstack.sh /usr/local/bin/bootstrap-openstack
2026-04-20 00:42:07.044595 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/301-openstack-octavia-amphora-image.sh /usr/local/bin/bootstrap-octavia
2026-04-20 00:42:07.053951 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/302-openstack-k8s-clusterapi-images.sh /usr/local/bin/bootstrap-clusterapi
2026-04-20 00:42:07.061932 | orchestrator | + sudo ln -sf /opt/configuration/scripts/disable-local-registry.sh /usr/local/bin/disable-local-registry
2026-04-20 00:42:07.069735 | orchestrator | + sudo ln -sf /opt/configuration/scripts/pull-images.sh /usr/local/bin/pull-images
2026-04-20 00:42:07.081018 | orchestrator | + [[ false == \t\r\u\e ]]
2026-04-20 00:42:07.265158 | orchestrator | ok: Runtime: 0:23:31.698205
2026-04-20 00:42:07.373582 |
2026-04-20 00:42:07.373722 | TASK [Deploy services]
2026-04-20 00:42:07.907249 | orchestrator | skipping: Conditional result was False
2026-04-20 00:42:07.916633 |
2026-04-20 00:42:07.916763 | TASK [Deploy in a nutshell]
2026-04-20 00:42:08.650081 | orchestrator | + set -e
2026-04-20 00:42:08.650228 | orchestrator | + source /opt/configuration/scripts/include.sh
2026-04-20 00:42:08.650241 | orchestrator | ++ export INTERACTIVE=false
2026-04-20 00:42:08.650250 | orchestrator | ++ INTERACTIVE=false
2026-04-20 00:42:08.650256 | orchestrator | ++ export OSISM_APPLY_RETRY=1
2026-04-20 00:42:08.650260 | orchestrator | ++ OSISM_APPLY_RETRY=1
2026-04-20 00:42:08.650266 | orchestrator | + source /opt/manager-vars.sh
2026-04-20 00:42:08.650289 | orchestrator | ++ export NUMBER_OF_NODES=6
2026-04-20 00:42:08.650301 | orchestrator | ++ NUMBER_OF_NODES=6
2026-04-20 00:42:08.650306 | orchestrator | ++ export CEPH_VERSION=reef
2026-04-20 00:42:08.650313 | orchestrator | ++ CEPH_VERSION=reef
2026-04-20 00:42:08.650317 | orchestrator | ++ export CONFIGURATION_VERSION=main
2026-04-20 00:42:08.650324 | orchestrator | ++ CONFIGURATION_VERSION=main
2026-04-20 00:42:08.650329 | orchestrator | ++ export MANAGER_VERSION=10.0.0
2026-04-20 00:42:08.650337 | orchestrator | ++ MANAGER_VERSION=10.0.0
2026-04-20 00:42:08.650341 | orchestrator | ++ export OPENSTACK_VERSION=2024.2
2026-04-20 00:42:08.650347 | orchestrator | ++ OPENSTACK_VERSION=2024.2
2026-04-20 00:42:08.650360 | orchestrator | ++ export ARA=false
2026-04-20 00:42:08.650364 | orchestrator | ++ ARA=false
2026-04-20 00:42:08.650368 | orchestrator | ++ export DEPLOY_MODE=manager
2026-04-20 00:42:08.650373 | orchestrator | ++ DEPLOY_MODE=manager
2026-04-20 00:42:08.650377 | orchestrator | ++ export TEMPEST=true
2026-04-20 00:42:08.651845 | orchestrator |
2026-04-20 00:42:08.651858 | orchestrator | # PULL IMAGES
2026-04-20 00:42:08.651862 | orchestrator |
2026-04-20 00:42:08.651867 | orchestrator | ++ TEMPEST=true
2026-04-20 00:42:08.651871 | orchestrator | ++ export IS_ZUUL=true
2026-04-20 00:42:08.651874 | orchestrator | ++ IS_ZUUL=true
2026-04-20 00:42:08.651878 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.193.117
2026-04-20 00:42:08.651883 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.193.117
2026-04-20 00:42:08.651886 | orchestrator | ++ export EXTERNAL_API=false
2026-04-20 00:42:08.651890 | orchestrator | ++ EXTERNAL_API=false
2026-04-20 00:42:08.651894 | orchestrator | ++ export IMAGE_USER=ubuntu
2026-04-20 00:42:08.651898 | orchestrator | ++ IMAGE_USER=ubuntu
2026-04-20 00:42:08.651902 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu
2026-04-20 00:42:08.651906 | orchestrator | ++ IMAGE_NODE_USER=ubuntu
2026-04-20 00:42:08.651910 | orchestrator | ++ export CEPH_STACK=ceph-ansible
2026-04-20 00:42:08.651918 | orchestrator | ++ CEPH_STACK=ceph-ansible
2026-04-20 00:42:08.651922 | orchestrator | + echo
2026-04-20 00:42:08.651926 | orchestrator | + echo '# PULL IMAGES'
2026-04-20 00:42:08.651930 | orchestrator | + echo
2026-04-20 00:42:08.652086 | orchestrator | ++ semver 10.0.0 7.0.0
2026-04-20 00:42:08.706453 | orchestrator | + [[ 1 -ge 0 ]]
2026-04-20 00:42:08.706523 | orchestrator | + osism apply --no-wait -r 2 -e custom pull-images
2026-04-20 00:42:09.947603 | orchestrator | 2026-04-20 00:42:09 | INFO  | Trying to run play pull-images in environment custom
2026-04-20 00:42:20.040228 | orchestrator | 2026-04-20 00:42:20 | INFO  | Prepare task for execution of pull-images.
2026-04-20 00:42:20.117460 | orchestrator | 2026-04-20 00:42:20 | INFO  | Task e77d9bb0-7f8d-4853-9182-3fc77c1ed3bd (pull-images) was prepared for execution.
2026-04-20 00:42:20.117557 | orchestrator | 2026-04-20 00:42:20 | INFO  | Task e77d9bb0-7f8d-4853-9182-3fc77c1ed3bd is running in background. No more output. Check ARA for logs.
2026-04-20 00:42:21.666000 | orchestrator | 2026-04-20 00:42:21 | INFO  | Trying to run play wipe-partitions in environment custom
2026-04-20 00:42:31.712069 | orchestrator | 2026-04-20 00:42:31 | INFO  | Prepare task for execution of wipe-partitions.
2026-04-20 00:42:31.789370 | orchestrator | 2026-04-20 00:42:31 | INFO  | Task 996ba825-a169-4209-942b-0ac1a70b4560 (wipe-partitions) was prepared for execution.
2026-04-20 00:42:31.789438 | orchestrator | 2026-04-20 00:42:31 | INFO  | It takes a moment until task 996ba825-a169-4209-942b-0ac1a70b4560 (wipe-partitions) has been started and output is visible here.
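The trace gates the background image pre-pull on the manager version: `semver 10.0.0 7.0.0` prints `1` (left argument is newer), and `[[ 1 -ge 0 ]]` lets `osism apply --no-wait -r 2 -e custom pull-images` run. A hypothetical condensation of that gate into a helper function (the function name is invented; `semver A B` is assumed to print 1/0/-1 as A is newer than / equal to / older than B, matching the traced output):

```shell
# Trigger a fire-and-forget image pre-pull when the manager is new enough.
# --no-wait detaches the task, -r 2 retries, -e custom selects the environment.
pull_images_if_supported() {
    if [ "$(semver "$MANAGER_VERSION" 7.0.0)" -ge 0 ]; then
        osism apply --no-wait -r 2 -e custom pull-images
    else
        echo "manager $MANAGER_VERSION too old for image pre-pulling" >&2
        return 1
    fi
}
```

Because the task is detached, its output goes to ARA rather than this console, which is why the log only shows the "running in background" notice before moving on to wipe-partitions.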
2026-04-20 00:42:43.297787 | orchestrator |
2026-04-20 00:42:43.297981 | orchestrator | PLAY [Wipe partitions] *********************************************************
2026-04-20 00:42:43.298001 | orchestrator |
2026-04-20 00:42:43.298069 | orchestrator | TASK [Find all logical devices owned by UID 167] *******************************
2026-04-20 00:42:43.298092 | orchestrator | Monday 20 April 2026 00:42:34 +0000 (0:00:00.148) 0:00:00.148 **********
2026-04-20 00:42:43.298108 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:42:43.298153 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:42:43.298166 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:42:43.298176 | orchestrator |
2026-04-20 00:42:43.298194 | orchestrator | TASK [Remove all rook related logical devices] *********************************
2026-04-20 00:42:43.298204 | orchestrator | Monday 20 April 2026 00:42:35 +0000 (0:00:00.922) 0:00:01.070 **********
2026-04-20 00:42:43.298214 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:42:43.298235 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:42:43.298246 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:42:43.298255 | orchestrator |
2026-04-20 00:42:43.298272 | orchestrator | TASK [Find all logical devices with prefix ceph] *******************************
2026-04-20 00:42:43.298286 | orchestrator | Monday 20 April 2026 00:42:35 +0000 (0:00:00.228) 0:00:01.298 **********
2026-04-20 00:42:43.298298 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:42:43.298318 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:42:43.298329 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:42:43.298341 | orchestrator |
2026-04-20 00:42:43.298358 | orchestrator | TASK [Remove all ceph related logical devices] *********************************
2026-04-20 00:42:43.298395 | orchestrator | Monday 20 April 2026 00:42:36 +0000 (0:00:00.618) 0:00:01.917 **********
2026-04-20 00:42:43.298409 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:42:43.298425 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:42:43.298439 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:42:43.298450 | orchestrator |
2026-04-20 00:42:43.298466 | orchestrator | TASK [Check device availability] ***********************************************
2026-04-20 00:42:43.298481 | orchestrator | Monday 20 April 2026 00:42:36 +0000 (0:00:00.275) 0:00:02.193 **********
2026-04-20 00:42:43.298492 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdb)
2026-04-20 00:42:43.298510 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdb)
2026-04-20 00:42:43.298526 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdb)
2026-04-20 00:42:43.298537 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdc)
2026-04-20 00:42:43.298548 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdc)
2026-04-20 00:42:43.298560 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdc)
2026-04-20 00:42:43.298578 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdd)
2026-04-20 00:42:43.298589 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdd)
2026-04-20 00:42:43.298605 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdd)
2026-04-20 00:42:43.298618 | orchestrator |
2026-04-20 00:42:43.298630 | orchestrator | TASK [Wipe partitions with wipefs] *********************************************
2026-04-20 00:42:43.298647 | orchestrator | Monday 20 April 2026 00:42:38 +0000 (0:00:01.339) 0:00:03.532 **********
2026-04-20 00:42:43.298657 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdb)
2026-04-20 00:42:43.298670 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdb)
2026-04-20 00:42:43.298686 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdb)
2026-04-20 00:42:43.298697 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdc)
2026-04-20 00:42:43.298714 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdc)
2026-04-20 00:42:43.298724 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdc)
2026-04-20 00:42:43.298736 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdd)
2026-04-20 00:42:43.298753 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdd)
2026-04-20 00:42:43.298763 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdd)
2026-04-20 00:42:43.298777 | orchestrator |
2026-04-20 00:42:43.298791 | orchestrator | TASK [Overwrite first 32M with zeros] ******************************************
2026-04-20 00:42:43.298801 | orchestrator | Monday 20 April 2026 00:42:39 +0000 (0:00:01.360) 0:00:04.893 **********
2026-04-20 00:42:43.298814 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdb)
2026-04-20 00:42:43.298830 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdb)
2026-04-20 00:42:43.298841 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdb)
2026-04-20 00:42:43.298867 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdc)
2026-04-20 00:42:43.298878 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdc)
2026-04-20 00:42:43.298905 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdc)
2026-04-20 00:42:43.298915 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdd)
2026-04-20 00:42:43.298997 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdd)
2026-04-20 00:42:43.299012 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdd)
2026-04-20 00:42:43.299029 | orchestrator |
2026-04-20 00:42:43.299039 | orchestrator | TASK [Reload udev rules] *******************************************************
2026-04-20 00:42:43.299049 | orchestrator | Monday 20 April 2026 00:42:41 +0000 (0:00:02.150) 0:00:07.043 **********
2026-04-20 00:42:43.299066 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:42:43.299076 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:42:43.299086 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:42:43.299103 | orchestrator |
2026-04-20 00:42:43.299115 | orchestrator | TASK [Request device events from the kernel] ***********************************
2026-04-20 00:42:43.299124 | orchestrator | Monday 20 April 2026 00:42:42 +0000 (0:00:00.646) 0:00:07.690 **********
2026-04-20 00:42:43.299137 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:42:43.299152 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:42:43.299162 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:42:43.299171 | orchestrator |
2026-04-20 00:42:43.299182 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:42:43.299200 | orchestrator | testbed-node-3 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-20 00:42:43.299213 | orchestrator | testbed-node-4 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-20 00:42:43.299249 | orchestrator | testbed-node-5 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-20 00:42:43.299266 | orchestrator |
2026-04-20 00:42:43.299278 | orchestrator |
2026-04-20 00:42:43.299288 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:42:43.299305 | orchestrator | Monday 20 April 2026 00:42:43 +0000 (0:00:00.800) 0:00:08.491 **********
2026-04-20 00:42:43.299316 | orchestrator | ===============================================================================
2026-04-20 00:42:43.299326 | orchestrator | Overwrite first 32M with zeros ------------------------------------------ 2.15s
2026-04-20 00:42:43.299342 | orchestrator | Wipe partitions with wipefs --------------------------------------------- 1.36s
2026-04-20 00:42:43.299354 | orchestrator | Check device availability ----------------------------------------------- 1.34s
2026-04-20 00:42:43.299367 | orchestrator | Find all logical devices owned by UID 167 ------------------------------- 0.92s
2026-04-20 00:42:43.299382 | orchestrator | Request device events from the kernel ----------------------------------- 0.80s
2026-04-20 00:42:43.299392 | orchestrator | Reload udev rules ------------------------------------------------------- 0.65s
2026-04-20 00:42:43.299404 | orchestrator | Find all logical devices with prefix ceph ------------------------------- 0.62s
2026-04-20 00:42:43.299420 | orchestrator | Remove all ceph related logical devices --------------------------------- 0.28s
2026-04-20 00:42:43.299430 | orchestrator | Remove all rook related logical devices --------------------------------- 0.23s
2026-04-20 00:42:54.705932 | orchestrator | 2026-04-20 00:42:54 | INFO  | Prepare task for execution of facts.
2026-04-20 00:42:54.787147 | orchestrator | 2026-04-20 00:42:54 | INFO  | Task ebec1ab9-548e-49d7-9b95-21cfa3f203d7 (facts) was prepared for execution.
2026-04-20 00:42:54.789138 | orchestrator | 2026-04-20 00:42:54 | INFO  | It takes a moment until task ebec1ab9-548e-49d7-9b95-21cfa3f203d7 (facts) has been started and output is visible here.
2026-04-20 00:43:07.141963 | orchestrator |
2026-04-20 00:43:07.142201 | orchestrator | PLAY [Apply role facts] ********************************************************
2026-04-20 00:43:07.142217 | orchestrator |
2026-04-20 00:43:07.142226 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] *********************
2026-04-20 00:43:07.142258 | orchestrator | Monday 20 April 2026 00:42:58 +0000 (0:00:00.336) 0:00:00.336 **********
2026-04-20 00:43:07.142267 | orchestrator | ok: [testbed-manager]
2026-04-20 00:43:07.142277 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:43:07.142285 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:43:07.142293 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:43:07.142300 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:43:07.142311 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:43:07.142319 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:43:07.142327 | orchestrator |
2026-04-20 00:43:07.142335 | orchestrator | TASK [osism.commons.facts : Copy fact files] ***********************************
2026-04-20 00:43:07.142343 | orchestrator | Monday 20 April 2026 00:42:59 +0000 (0:00:01.301) 0:00:01.638 **********
2026-04-20 00:43:07.142351 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:43:07.142359 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:43:07.142367 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:43:07.142375 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:43:07.142383 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:07.142390 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:07.142398 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:43:07.142406 | orchestrator |
2026-04-20 00:43:07.142414 | orchestrator | PLAY [Gather facts for all hosts] **********************************************
2026-04-20 00:43:07.142421 | orchestrator |
2026-04-20 00:43:07.142429 | orchestrator | TASK [Gathers facts about hosts] ***********************************************
2026-04-20 00:43:07.142437 | orchestrator | Monday 20 April 2026 00:43:00 +0000 (0:00:01.193) 0:00:02.831 **********
2026-04-20 00:43:07.142445 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:43:07.142453 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:43:07.142461 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:43:07.142469 | orchestrator | ok: [testbed-manager]
2026-04-20 00:43:07.142477 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:43:07.142485 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:43:07.142492 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:43:07.142500 | orchestrator |
2026-04-20 00:43:07.142508 | orchestrator | PLAY [Gather facts for all hosts if using --limit] *****************************
2026-04-20 00:43:07.142516 | orchestrator |
2026-04-20 00:43:07.142537 | orchestrator | TASK [Gather facts for all hosts] **********************************************
2026-04-20 00:43:07.142545 | orchestrator | Monday 20 April 2026 00:43:06 +0000 (0:00:05.828) 0:00:08.660 **********
2026-04-20 00:43:07.142553 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:43:07.142561 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:43:07.142569 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:43:07.142577 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:43:07.142585 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:07.142592 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:07.142600 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:43:07.142608 | orchestrator |
2026-04-20 00:43:07.142616 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:43:07.142624 | orchestrator | testbed-manager : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-20 00:43:07.142634 | orchestrator | testbed-node-0 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-20 00:43:07.142642 | orchestrator | testbed-node-1 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-20 00:43:07.142650 | orchestrator | testbed-node-2 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-20 00:43:07.142659 | orchestrator | testbed-node-3 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-20 00:43:07.142666 | orchestrator | testbed-node-4 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-20 00:43:07.142682 | orchestrator | testbed-node-5 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-20 00:43:07.142690 | orchestrator |
2026-04-20 00:43:07.142698 | orchestrator |
2026-04-20 00:43:07.142706 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:43:07.142714 | orchestrator | Monday 20 April 2026 00:43:06 +0000 (0:00:00.489) 0:00:09.149 **********
2026-04-20 00:43:07.142722 | orchestrator | ===============================================================================
2026-04-20 00:43:07.142730 | orchestrator | Gathers facts about hosts ----------------------------------------------- 5.83s
2026-04-20 00:43:07.142738 | orchestrator | osism.commons.facts : Create custom facts directory --------------------- 1.30s
2026-04-20 00:43:07.142746 | orchestrator | osism.commons.facts : Copy fact files ----------------------------------- 1.19s
2026-04-20 00:43:07.142753 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.49s
2026-04-20 00:43:08.589184 | orchestrator | 2026-04-20 00:43:08 | INFO  | Prepare task for execution of ceph-configure-lvm-volumes.
2026-04-20 00:43:08.652873 | orchestrator | 2026-04-20 00:43:08 | INFO  | Task 3c1c7641-a682-4b34-b440-71fc7fa36f9d (ceph-configure-lvm-volumes) was prepared for execution.
2026-04-20 00:43:08.652975 | orchestrator | 2026-04-20 00:43:08 | INFO  | It takes a moment until task 3c1c7641-a682-4b34-b440-71fc7fa36f9d (ceph-configure-lvm-volumes) has been started and output is visible here.
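The wipe-partitions play above runs three steps per disk (`/dev/sdb` through `/dev/sdd` on nodes 3-5): wipe filesystem signatures with `wipefs`, overwrite the first 32M with zeros, then reload udev rules and request device events. Collapsed into plain shell as a sketch only (the actual play is Ansible; `wipe_disk` is a hypothetical name):

```shell
# DESTRUCTIVE sketch: clear a block device so leftover Ceph/LVM metadata
# cannot resurface when the device is re-provisioned as an OSD.
wipe_disk() {
    local device=$1
    # remove filesystem, RAID, and partition-table signatures
    wipefs -a "$device"
    # belt-and-braces: zero the first 32 MiB as the play's next task does
    dd if=/dev/zero of="$device" bs=1M count=32
}

# afterwards let udev notice the now-blank devices, as the final two tasks do:
#   udevadm control --reload-rules && udevadm trigger
```

The udev steps matter because device symlinks (e.g. the `scsi-*` links used later by ceph-configure-lvm-volumes) are only refreshed once the kernel re-emits device events.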
2026-04-20 00:43:20.189554 | orchestrator | [WARNING]: Collection community.general does not support Ansible version
2026-04-20 00:43:20.189665 | orchestrator | 2.16.14
2026-04-20 00:43:20.189683 | orchestrator |
2026-04-20 00:43:20.189696 | orchestrator | PLAY [Ceph configure LVM] ******************************************************
2026-04-20 00:43:20.189708 | orchestrator |
2026-04-20 00:43:20.189718 | orchestrator | TASK [Get extra vars for Ceph configuration] ***********************************
2026-04-20 00:43:20.189730 | orchestrator | Monday 20 April 2026 00:43:13 +0000 (0:00:00.294) 0:00:00.294 **********
2026-04-20 00:43:20.189741 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)]
2026-04-20 00:43:20.189752 | orchestrator |
2026-04-20 00:43:20.189763 | orchestrator | TASK [Get initial list of available block devices] *****************************
2026-04-20 00:43:20.189774 | orchestrator | Monday 20 April 2026 00:43:13 +0000 (0:00:00.245) 0:00:00.540 **********
2026-04-20 00:43:20.189785 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:43:20.189796 | orchestrator |
2026-04-20 00:43:20.189807 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:20.189818 | orchestrator | Monday 20 April 2026 00:43:13 +0000 (0:00:00.219) 0:00:00.760 **********
2026-04-20 00:43:20.189829 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop0)
2026-04-20 00:43:20.189840 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop1)
2026-04-20 00:43:20.189850 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop2)
2026-04-20 00:43:20.189861 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop3)
2026-04-20 00:43:20.189872 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop4)
2026-04-20 00:43:20.189882 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop5)
2026-04-20 00:43:20.189893 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop6)
2026-04-20 00:43:20.189916 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop7)
2026-04-20 00:43:20.189927 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sda)
2026-04-20 00:43:20.189938 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdb)
2026-04-20 00:43:20.189972 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdc)
2026-04-20 00:43:20.189983 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdd)
2026-04-20 00:43:20.189994 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sr0)
2026-04-20 00:43:20.190005 | orchestrator |
2026-04-20 00:43:20.190102 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:20.190117 | orchestrator | Monday 20 April 2026 00:43:13 +0000 (0:00:00.351) 0:00:01.111 **********
2026-04-20 00:43:20.190130 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:20.190143 | orchestrator |
2026-04-20 00:43:20.190155 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:20.190167 | orchestrator | Monday 20 April 2026 00:43:14 +0000 (0:00:00.459) 0:00:01.571 **********
2026-04-20 00:43:20.190180 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:20.190192 | orchestrator |
2026-04-20 00:43:20.190205 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:20.190217 | orchestrator | Monday 20 April 2026 00:43:14 +0000 (0:00:00.180) 0:00:01.752 **********
2026-04-20 00:43:20.190233 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:20.190245 | orchestrator |
2026-04-20 00:43:20.190257 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:20.190270 | orchestrator | Monday 20 April 2026 00:43:14 +0000 (0:00:00.176) 0:00:01.928 **********
2026-04-20 00:43:20.190282 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:20.190295 | orchestrator |
2026-04-20 00:43:20.190308 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:20.190321 | orchestrator | Monday 20 April 2026 00:43:14 +0000 (0:00:00.184) 0:00:02.113 **********
2026-04-20 00:43:20.190334 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:20.190346 | orchestrator |
2026-04-20 00:43:20.190358 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:20.190371 | orchestrator | Monday 20 April 2026 00:43:15 +0000 (0:00:00.175) 0:00:02.289 **********
2026-04-20 00:43:20.190383 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:20.190395 | orchestrator |
2026-04-20 00:43:20.190409 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:20.190423 | orchestrator | Monday 20 April 2026 00:43:15 +0000 (0:00:00.198) 0:00:02.488 **********
2026-04-20 00:43:20.190435 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:20.190446 | orchestrator |
2026-04-20 00:43:20.190457 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:20.190467 | orchestrator | Monday 20 April 2026 00:43:15 +0000 (0:00:00.191) 0:00:02.680 **********
2026-04-20 00:43:20.190478 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:20.190489 | orchestrator |
2026-04-20 00:43:20.190499 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:20.190510 | orchestrator | Monday 20 April 2026 00:43:15 +0000 (0:00:00.193) 0:00:02.873 **********
2026-04-20 00:43:20.190521 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e)
2026-04-20 00:43:20.190533 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e)
2026-04-20 00:43:20.190543 | orchestrator |
2026-04-20 00:43:20.190554 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:20.190583 | orchestrator | Monday 20 April 2026 00:43:16 +0000 (0:00:00.396) 0:00:03.270 **********
2026-04-20 00:43:20.190595 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_71e5e2fe-8079-44a9-83c9-718c1a37ec11)
2026-04-20 00:43:20.190606 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_71e5e2fe-8079-44a9-83c9-718c1a37ec11)
2026-04-20 00:43:20.190617 | orchestrator |
2026-04-20 00:43:20.190627 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:20.190638 | orchestrator | Monday 20 April 2026 00:43:16 +0000 (0:00:00.393) 0:00:03.663 **********
2026-04-20 00:43:20.190657 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_0c844390-ddcc-47db-87c2-e0ad3f299f11)
2026-04-20 00:43:20.190668 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_0c844390-ddcc-47db-87c2-e0ad3f299f11)
2026-04-20 00:43:20.190679 | orchestrator |
2026-04-20 00:43:20.190690 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:20.190701 | orchestrator | Monday 20 April 2026 00:43:17 +0000 (0:00:00.611) 0:00:04.275 **********
2026-04-20 00:43:20.190711 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_4d9b431e-9b52-486b-bddb-3e9e0ee5fa39)
2026-04-20 00:43:20.190722 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_4d9b431e-9b52-486b-bddb-3e9e0ee5fa39)
2026-04-20 00:43:20.190733 | orchestrator |
2026-04-20 00:43:20.190743 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:20.190760 | orchestrator | Monday 20 April 2026 00:43:17 +0000 (0:00:00.620) 0:00:04.896 **********
2026-04-20 00:43:20.190771 | orchestrator | ok: [testbed-node-3] => (item=ata-QEMU_DVD-ROM_QM00001)
2026-04-20 00:43:20.190782 | orchestrator |
2026-04-20 00:43:20.190792 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:20.190803 | orchestrator | Monday 20 April 2026 00:43:18 +0000 (0:00:00.739) 0:00:05.635 **********
2026-04-20 00:43:20.190814 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop0)
2026-04-20 00:43:20.190824 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop1)
2026-04-20 00:43:20.190835 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop2)
2026-04-20 00:43:20.190846 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop3)
2026-04-20 00:43:20.190857 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop4)
2026-04-20 00:43:20.190867 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop5)
2026-04-20 00:43:20.190878 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop6)
2026-04-20 00:43:20.190888 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop7)
2026-04-20 00:43:20.190899 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sda)
2026-04-20 00:43:20.190910 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdb)
2026-04-20 00:43:20.190920 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdc)
2026-04-20 00:43:20.190931 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdd)
2026-04-20 00:43:20.190942 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sr0)
2026-04-20 00:43:20.190952 | orchestrator |
2026-04-20 00:43:20.190963 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:20.190974 | orchestrator | Monday 20 April 2026 00:43:18 +0000 (0:00:00.405) 0:00:06.041 **********
2026-04-20 00:43:20.190984 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:20.190995 | orchestrator |
2026-04-20 00:43:20.191006 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:20.191016 | orchestrator | Monday 20 April 2026 00:43:19 +0000 (0:00:00.204) 0:00:06.245 **********
2026-04-20 00:43:20.191027 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:20.191068 | orchestrator |
2026-04-20 00:43:20.191080 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:20.191091 | orchestrator | Monday 20 April 2026 00:43:19 +0000 (0:00:00.192) 0:00:06.438 **********
2026-04-20 00:43:20.191101 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:20.191112 | orchestrator |
2026-04-20 00:43:20.191128 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:20.191145 | orchestrator | Monday 20 April 2026 00:43:19 +0000 (0:00:00.176) 0:00:06.614 **********
2026-04-20 00:43:20.191155 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:20.191166 | orchestrator |
2026-04-20 00:43:20.191177 | orchestrator | TASK [Add known
partitions to the list of available block devices] ************* 2026-04-20 00:43:20.191188 | orchestrator | Monday 20 April 2026 00:43:19 +0000 (0:00:00.195) 0:00:06.810 ********** 2026-04-20 00:43:20.191198 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:43:20.191209 | orchestrator | 2026-04-20 00:43:20.191220 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:43:20.191230 | orchestrator | Monday 20 April 2026 00:43:19 +0000 (0:00:00.190) 0:00:07.000 ********** 2026-04-20 00:43:20.191241 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:43:20.191252 | orchestrator | 2026-04-20 00:43:20.191262 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:43:20.191273 | orchestrator | Monday 20 April 2026 00:43:20 +0000 (0:00:00.189) 0:00:07.190 ********** 2026-04-20 00:43:20.191284 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:43:20.191295 | orchestrator | 2026-04-20 00:43:20.191311 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:43:27.272346 | orchestrator | Monday 20 April 2026 00:43:20 +0000 (0:00:00.178) 0:00:07.368 ********** 2026-04-20 00:43:27.272461 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:43:27.272477 | orchestrator | 2026-04-20 00:43:27.272490 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:43:27.272502 | orchestrator | Monday 20 April 2026 00:43:20 +0000 (0:00:00.191) 0:00:07.560 ********** 2026-04-20 00:43:27.272513 | orchestrator | ok: [testbed-node-3] => (item=sda1) 2026-04-20 00:43:27.272525 | orchestrator | ok: [testbed-node-3] => (item=sda14) 2026-04-20 00:43:27.272537 | orchestrator | ok: [testbed-node-3] => (item=sda15) 2026-04-20 00:43:27.272547 | orchestrator | ok: [testbed-node-3] => (item=sda16) 2026-04-20 00:43:27.272558 | orchestrator | 2026-04-20 
00:43:27.272569 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:27.272580 | orchestrator | Monday 20 April 2026 00:43:21 +0000 (0:00:00.924) 0:00:08.484 **********
2026-04-20 00:43:27.272591 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:27.272601 | orchestrator |
2026-04-20 00:43:27.272612 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:27.272623 | orchestrator | Monday 20 April 2026 00:43:21 +0000 (0:00:00.182) 0:00:08.667 **********
2026-04-20 00:43:27.272634 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:27.272644 | orchestrator |
2026-04-20 00:43:27.272655 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:27.272666 | orchestrator | Monday 20 April 2026 00:43:21 +0000 (0:00:00.210) 0:00:08.878 **********
2026-04-20 00:43:27.272677 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:27.272688 | orchestrator |
2026-04-20 00:43:27.272699 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:27.272710 | orchestrator | Monday 20 April 2026 00:43:21 +0000 (0:00:00.181) 0:00:09.060 **********
2026-04-20 00:43:27.272720 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:27.272731 | orchestrator |
2026-04-20 00:43:27.272742 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] ***********************************************
2026-04-20 00:43:27.272753 | orchestrator | Monday 20 April 2026 00:43:22 +0000 (0:00:00.186) 0:00:09.246 **********
2026-04-20 00:43:27.272764 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': None})
2026-04-20 00:43:27.272775 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': None})
2026-04-20 00:43:27.272786 | orchestrator |
2026-04-20 00:43:27.272799 | orchestrator | TASK [Generate WAL VG names] ***************************************************
2026-04-20 00:43:27.272810 | orchestrator | Monday 20 April 2026 00:43:22 +0000 (0:00:00.153) 0:00:09.399 **********
2026-04-20 00:43:27.272821 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:27.272858 | orchestrator |
2026-04-20 00:43:27.272870 | orchestrator | TASK [Generate DB VG names] ****************************************************
2026-04-20 00:43:27.272883 | orchestrator | Monday 20 April 2026 00:43:22 +0000 (0:00:00.114) 0:00:09.514 **********
2026-04-20 00:43:27.272896 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:27.272908 | orchestrator |
2026-04-20 00:43:27.272920 | orchestrator | TASK [Generate shared DB/WAL VG names] *****************************************
2026-04-20 00:43:27.272933 | orchestrator | Monday 20 April 2026 00:43:22 +0000 (0:00:00.129) 0:00:09.643 **********
2026-04-20 00:43:27.272946 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:27.272957 | orchestrator |
2026-04-20 00:43:27.272970 | orchestrator | TASK [Define lvm_volumes structures] *******************************************
2026-04-20 00:43:27.272982 | orchestrator | Monday 20 April 2026 00:43:22 +0000 (0:00:00.167) 0:00:09.811 **********
2026-04-20 00:43:27.272995 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:43:27.273007 | orchestrator |
2026-04-20 00:43:27.273020 | orchestrator | TASK [Generate lvm_volumes structure (block only)] *****************************
2026-04-20 00:43:27.273033 | orchestrator | Monday 20 April 2026 00:43:22 +0000 (0:00:00.123) 0:00:09.935 **********
2026-04-20 00:43:27.273045 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '4264b90b-a777-529d-80cd-078215cd7b61'}})
2026-04-20 00:43:27.273058 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '0c7195b4-6e55-5dce-81dc-250aafa1626c'}})
2026-04-20 00:43:27.273104 | orchestrator |
2026-04-20 00:43:27.273121 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] *****************************
2026-04-20 00:43:27.273134 | orchestrator | Monday 20 April 2026 00:43:22 +0000 (0:00:00.150) 0:00:10.085 **********
2026-04-20 00:43:27.273148 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '4264b90b-a777-529d-80cd-078215cd7b61'}})
2026-04-20 00:43:27.273176 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '0c7195b4-6e55-5dce-81dc-250aafa1626c'}})
2026-04-20 00:43:27.273189 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:27.273201 | orchestrator |
2026-04-20 00:43:27.273213 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] ****************************
2026-04-20 00:43:27.273226 | orchestrator | Monday 20 April 2026 00:43:23 +0000 (0:00:00.144) 0:00:10.229 **********
2026-04-20 00:43:27.273238 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '4264b90b-a777-529d-80cd-078215cd7b61'}})
2026-04-20 00:43:27.273249 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '0c7195b4-6e55-5dce-81dc-250aafa1626c'}})
2026-04-20 00:43:27.273260 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:27.273270 | orchestrator |
2026-04-20 00:43:27.273281 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] ***********************
2026-04-20 00:43:27.273303 | orchestrator | Monday 20 April 2026 00:43:23 +0000 (0:00:00.142) 0:00:10.372 **********
2026-04-20 00:43:27.273315 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '4264b90b-a777-529d-80cd-078215cd7b61'}})
2026-04-20 00:43:27.273342 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '0c7195b4-6e55-5dce-81dc-250aafa1626c'}})
2026-04-20 00:43:27.273354 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:27.273365 | orchestrator |
2026-04-20 00:43:27.273376 | orchestrator | TASK [Compile lvm_volumes] *****************************************************
2026-04-20 00:43:27.273386 | orchestrator | Monday 20 April 2026 00:43:23 +0000 (0:00:00.298) 0:00:10.670 **********
2026-04-20 00:43:27.273397 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:43:27.273408 | orchestrator |
2026-04-20 00:43:27.273427 | orchestrator | TASK [Set OSD devices config data] *********************************************
2026-04-20 00:43:27.273439 | orchestrator | Monday 20 April 2026 00:43:23 +0000 (0:00:00.128) 0:00:10.799 **********
2026-04-20 00:43:27.273449 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:43:27.273460 | orchestrator |
2026-04-20 00:43:27.273480 | orchestrator | TASK [Set DB devices config data] **********************************************
2026-04-20 00:43:27.273491 | orchestrator | Monday 20 April 2026 00:43:23 +0000 (0:00:00.136) 0:00:10.935 **********
2026-04-20 00:43:27.273501 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:27.273512 | orchestrator |
2026-04-20 00:43:27.273523 | orchestrator | TASK [Set WAL devices config data] *********************************************
2026-04-20 00:43:27.273534 | orchestrator | Monday 20 April 2026 00:43:23 +0000 (0:00:00.136) 0:00:11.071 **********
2026-04-20 00:43:27.273545 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:27.273555 | orchestrator |
2026-04-20 00:43:27.273566 | orchestrator | TASK [Set DB+WAL devices config data] ******************************************
2026-04-20 00:43:27.273577 | orchestrator | Monday 20 April 2026 00:43:24 +0000 (0:00:00.121) 0:00:11.193 **********
2026-04-20 00:43:27.273587 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:27.273598 | orchestrator |
2026-04-20 00:43:27.273609 | orchestrator | TASK [Print ceph_osd_devices] **************************************************
2026-04-20 00:43:27.273619 | orchestrator | Monday 20 April 2026 00:43:24 +0000 (0:00:00.118) 0:00:11.311 **********
2026-04-20 00:43:27.273630 | orchestrator | ok: [testbed-node-3] => {
2026-04-20 00:43:27.273641 | orchestrator |     "ceph_osd_devices": {
2026-04-20 00:43:27.273652 | orchestrator |         "sdb": {
2026-04-20 00:43:27.273664 | orchestrator |             "osd_lvm_uuid": "4264b90b-a777-529d-80cd-078215cd7b61"
2026-04-20 00:43:27.273675 | orchestrator |         },
2026-04-20 00:43:27.273686 | orchestrator |         "sdc": {
2026-04-20 00:43:27.273697 | orchestrator |             "osd_lvm_uuid": "0c7195b4-6e55-5dce-81dc-250aafa1626c"
2026-04-20 00:43:27.273708 | orchestrator |         }
2026-04-20 00:43:27.273719 | orchestrator |     }
2026-04-20 00:43:27.273730 | orchestrator | }
2026-04-20 00:43:27.273741 | orchestrator |
2026-04-20 00:43:27.273751 | orchestrator | TASK [Print WAL devices] *******************************************************
2026-04-20 00:43:27.273762 | orchestrator | Monday 20 April 2026 00:43:24 +0000 (0:00:00.119) 0:00:11.430 **********
2026-04-20 00:43:27.273773 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:27.273783 | orchestrator |
2026-04-20 00:43:27.273794 | orchestrator | TASK [Print DB devices] ********************************************************
2026-04-20 00:43:27.273804 | orchestrator | Monday 20 April 2026 00:43:24 +0000 (0:00:00.118) 0:00:11.549 **********
2026-04-20 00:43:27.273815 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:27.273826 | orchestrator |
2026-04-20 00:43:27.273837 | orchestrator | TASK [Print shared DB/WAL devices] *********************************************
2026-04-20 00:43:27.273847 | orchestrator | Monday 20 April 2026 00:43:24 +0000 (0:00:00.125) 0:00:11.674 **********
2026-04-20 00:43:27.273858 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:43:27.273868 | orchestrator |
2026-04-20 00:43:27.273879 | orchestrator | TASK [Print configuration data] ************************************************
2026-04-20 00:43:27.273890 | orchestrator | Monday 20 April 2026 00:43:24 +0000 (0:00:00.123) 0:00:11.798 **********
2026-04-20 00:43:27.273901 | orchestrator | changed: [testbed-node-3] => {
2026-04-20 00:43:27.273911 | orchestrator |     "_ceph_configure_lvm_config_data": {
2026-04-20 00:43:27.273922 | orchestrator |         "ceph_osd_devices": {
2026-04-20 00:43:27.273933 | orchestrator |             "sdb": {
2026-04-20 00:43:27.273944 | orchestrator |                 "osd_lvm_uuid": "4264b90b-a777-529d-80cd-078215cd7b61"
2026-04-20 00:43:27.273955 | orchestrator |             },
2026-04-20 00:43:27.273966 | orchestrator |             "sdc": {
2026-04-20 00:43:27.273977 | orchestrator |                 "osd_lvm_uuid": "0c7195b4-6e55-5dce-81dc-250aafa1626c"
2026-04-20 00:43:27.273987 | orchestrator |             }
2026-04-20 00:43:27.273998 | orchestrator |         },
2026-04-20 00:43:27.274009 | orchestrator |         "lvm_volumes": [
2026-04-20 00:43:27.274137 | orchestrator |             {
2026-04-20 00:43:27.274159 | orchestrator |                 "data": "osd-block-4264b90b-a777-529d-80cd-078215cd7b61",
2026-04-20 00:43:27.274179 | orchestrator |                 "data_vg": "ceph-4264b90b-a777-529d-80cd-078215cd7b61"
2026-04-20 00:43:27.274215 | orchestrator |             },
2026-04-20 00:43:27.274234 | orchestrator |             {
2026-04-20 00:43:27.274250 | orchestrator |                 "data": "osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c",
2026-04-20 00:43:27.274261 | orchestrator |                 "data_vg": "ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c"
2026-04-20 00:43:27.274272 | orchestrator |             }
2026-04-20 00:43:27.274282 | orchestrator |         ]
2026-04-20 00:43:27.274293 | orchestrator |     }
2026-04-20 00:43:27.274304 | orchestrator | }
2026-04-20 00:43:27.274315 | orchestrator |
2026-04-20 00:43:27.274326 | orchestrator | RUNNING HANDLER [Write configuration file] *************************************
2026-04-20 00:43:27.274344 | orchestrator | Monday 20 April 2026 00:43:24 +0000 (0:00:00.183) 0:00:11.982 **********
2026-04-20 00:43:27.274355 | orchestrator | changed: [testbed-node-3 -> testbed-manager(192.168.16.5)]
2026-04-20 00:43:27.274365 | orchestrator |
2026-04-20 00:43:27.274376 | orchestrator | PLAY [Ceph
configure LVM] ******************************************************
2026-04-20 00:43:27.274387 | orchestrator |
2026-04-20 00:43:27.274397 | orchestrator | TASK [Get extra vars for Ceph configuration] ***********************************
2026-04-20 00:43:27.274413 | orchestrator | Monday 20 April 2026 00:43:26 +0000 (0:00:02.029) 0:00:14.011 **********
2026-04-20 00:43:27.274436 | orchestrator | ok: [testbed-node-4 -> testbed-manager(192.168.16.5)]
2026-04-20 00:43:27.274465 | orchestrator |
2026-04-20 00:43:27.274480 | orchestrator | TASK [Get initial list of available block devices] *****************************
2026-04-20 00:43:27.274497 | orchestrator | Monday 20 April 2026 00:43:27 +0000 (0:00:00.234) 0:00:14.246 **********
2026-04-20 00:43:27.274514 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:43:27.274531 | orchestrator |
2026-04-20 00:43:27.274559 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:34.032639 | orchestrator | Monday 20 April 2026 00:43:27 +0000 (0:00:00.207) 0:00:14.453 **********
2026-04-20 00:43:34.032744 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop0)
2026-04-20 00:43:34.032760 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop1)
2026-04-20 00:43:34.032773 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop2)
2026-04-20 00:43:34.032785 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop3)
2026-04-20 00:43:34.032796 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop4)
2026-04-20 00:43:34.032808 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop5)
2026-04-20 00:43:34.032819 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop6)
2026-04-20 00:43:34.032831 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop7)
2026-04-20 00:43:34.032847 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sda)
2026-04-20 00:43:34.032859 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdb)
2026-04-20 00:43:34.032871 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdc)
2026-04-20 00:43:34.032882 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdd)
2026-04-20 00:43:34.032893 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sr0)
2026-04-20 00:43:34.032904 | orchestrator |
2026-04-20 00:43:34.032917 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:34.032928 | orchestrator | Monday 20 April 2026 00:43:27 +0000 (0:00:00.350) 0:00:14.803 **********
2026-04-20 00:43:34.032940 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:34.032952 | orchestrator |
2026-04-20 00:43:34.032964 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:34.032975 | orchestrator | Monday 20 April 2026 00:43:27 +0000 (0:00:00.181) 0:00:14.985 **********
2026-04-20 00:43:34.032986 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:34.033025 | orchestrator |
2026-04-20 00:43:34.033037 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:34.033049 | orchestrator | Monday 20 April 2026 00:43:27 +0000 (0:00:00.177) 0:00:15.162 **********
2026-04-20 00:43:34.033060 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:34.033071 | orchestrator |
2026-04-20 00:43:34.033082 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:34.033122 | orchestrator | Monday 20 April 2026 00:43:28 +0000 (0:00:00.191) 0:00:15.354 **********
2026-04-20 00:43:34.033133 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:34.033144 | orchestrator |
2026-04-20 00:43:34.033155 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:34.033167 | orchestrator | Monday 20 April 2026 00:43:28 +0000 (0:00:00.184) 0:00:15.538 **********
2026-04-20 00:43:34.033179 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:34.033192 | orchestrator |
2026-04-20 00:43:34.033204 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:34.033216 | orchestrator | Monday 20 April 2026 00:43:28 +0000 (0:00:00.168) 0:00:15.707 **********
2026-04-20 00:43:34.033229 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:34.033241 | orchestrator |
2026-04-20 00:43:34.033253 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:34.033267 | orchestrator | Monday 20 April 2026 00:43:29 +0000 (0:00:00.523) 0:00:16.230 **********
2026-04-20 00:43:34.033279 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:34.033291 | orchestrator |
2026-04-20 00:43:34.033303 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:34.033333 | orchestrator | Monday 20 April 2026 00:43:29 +0000 (0:00:00.240) 0:00:16.471 **********
2026-04-20 00:43:34.033347 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:34.033360 | orchestrator |
2026-04-20 00:43:34.033372 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:34.033385 | orchestrator | Monday 20 April 2026 00:43:29 +0000 (0:00:00.194) 0:00:16.665 **********
2026-04-20 00:43:34.033398 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501)
2026-04-20 00:43:34.033411 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501)
2026-04-20 00:43:34.033422 | orchestrator |
2026-04-20 00:43:34.033433 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:34.033444 | orchestrator | Monday 20 April 2026 00:43:29 +0000 (0:00:00.404) 0:00:17.070 **********
2026-04-20 00:43:34.033455 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_6f84c887-ba73-482f-a41f-d5b1a59c2e3c)
2026-04-20 00:43:34.033466 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_6f84c887-ba73-482f-a41f-d5b1a59c2e3c)
2026-04-20 00:43:34.033477 | orchestrator |
2026-04-20 00:43:34.033487 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:34.033498 | orchestrator | Monday 20 April 2026 00:43:30 +0000 (0:00:00.367) 0:00:17.437 **********
2026-04-20 00:43:34.033509 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_9b7f1cab-7403-4991-80fd-9e18e6faf85e)
2026-04-20 00:43:34.033520 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_9b7f1cab-7403-4991-80fd-9e18e6faf85e)
2026-04-20 00:43:34.033530 | orchestrator |
2026-04-20 00:43:34.033541 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:34.033570 | orchestrator | Monday 20 April 2026 00:43:30 +0000 (0:00:00.376) 0:00:17.814 **********
2026-04-20 00:43:34.033582 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_0604a395-fc8c-4060-a9f6-9fb568501435)
2026-04-20 00:43:34.033593 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_0604a395-fc8c-4060-a9f6-9fb568501435)
2026-04-20 00:43:34.033604 | orchestrator |
2026-04-20 00:43:34.033614 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:43:34.033635 | orchestrator | Monday 20 April 2026 00:43:30 +0000 (0:00:00.368) 0:00:18.183 **********
2026-04-20 00:43:34.033646 | orchestrator | ok: [testbed-node-4] => (item=ata-QEMU_DVD-ROM_QM00001)
2026-04-20 00:43:34.033657 | orchestrator |
2026-04-20 00:43:34.033668 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:34.033679 | orchestrator | Monday 20 April 2026 00:43:31 +0000 (0:00:00.291) 0:00:18.474 **********
2026-04-20 00:43:34.033690 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop0)
2026-04-20 00:43:34.033700 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop1)
2026-04-20 00:43:34.033711 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop2)
2026-04-20 00:43:34.033722 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop3)
2026-04-20 00:43:34.033739 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop4)
2026-04-20 00:43:34.033758 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop5)
2026-04-20 00:43:34.033778 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop6)
2026-04-20 00:43:34.033807 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop7)
2026-04-20 00:43:34.033827 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sda)
2026-04-20 00:43:34.033846 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdb)
2026-04-20 00:43:34.033864 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdc)
2026-04-20 00:43:34.033883 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdd)
2026-04-20 00:43:34.033900 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sr0)
2026-04-20 00:43:34.033916 | orchestrator |
2026-04-20 00:43:34.033933 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:34.033953 | orchestrator | Monday 20 April 2026 00:43:31 +0000 (0:00:00.329) 0:00:18.804 **********
2026-04-20 00:43:34.033972 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:34.033990 | orchestrator |
2026-04-20 00:43:34.034009 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:34.034154 | orchestrator | Monday 20 April 2026 00:43:31 +0000 (0:00:00.182) 0:00:18.986 **********
2026-04-20 00:43:34.034167 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:34.034177 | orchestrator |
2026-04-20 00:43:34.034188 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:34.034199 | orchestrator | Monday 20 April 2026 00:43:32 +0000 (0:00:00.444) 0:00:19.430 **********
2026-04-20 00:43:34.034210 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:34.034220 | orchestrator |
2026-04-20 00:43:34.034231 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:34.034252 | orchestrator | Monday 20 April 2026 00:43:32 +0000 (0:00:00.181) 0:00:19.612 **********
2026-04-20 00:43:34.034263 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:34.034274 | orchestrator |
2026-04-20 00:43:34.034285 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:34.034295 | orchestrator | Monday 20 April 2026 00:43:32 +0000 (0:00:00.174) 0:00:19.786 **********
2026-04-20 00:43:34.034306 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:34.034316 | orchestrator |
2026-04-20 00:43:34.034327 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:34.034338 | orchestrator | Monday 20 April 2026 00:43:32 +0000 (0:00:00.196) 0:00:19.983 **********
2026-04-20 00:43:34.034349 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:34.034359 | orchestrator |
2026-04-20 00:43:34.034370 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:34.034392 | orchestrator | Monday 20 April 2026 00:43:32 +0000 (0:00:00.183) 0:00:20.166 **********
2026-04-20 00:43:34.034402 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:34.034413 | orchestrator |
2026-04-20 00:43:34.034423 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:34.034434 | orchestrator | Monday 20 April 2026 00:43:33 +0000 (0:00:00.166) 0:00:20.332 **********
2026-04-20 00:43:34.034445 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:34.034456 | orchestrator |
2026-04-20 00:43:34.034466 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:34.034477 | orchestrator | Monday 20 April 2026 00:43:33 +0000 (0:00:00.182) 0:00:20.515 **********
2026-04-20 00:43:34.034487 | orchestrator | ok: [testbed-node-4] => (item=sda1)
2026-04-20 00:43:34.034499 | orchestrator | ok: [testbed-node-4] => (item=sda14)
2026-04-20 00:43:34.034510 | orchestrator | ok: [testbed-node-4] => (item=sda15)
2026-04-20 00:43:34.034521 | orchestrator | ok: [testbed-node-4] => (item=sda16)
2026-04-20 00:43:34.034531 | orchestrator |
2026-04-20 00:43:34.034542 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:34.034553 | orchestrator | Monday 20 April 2026 00:43:33 +0000 (0:00:00.586) 0:00:21.101 **********
2026-04-20 00:43:34.034564 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:39.515706 | orchestrator |
2026-04-20 00:43:39.515839 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:39.515862 | orchestrator | Monday 20 April 2026 00:43:34 +0000 (0:00:00.182) 0:00:21.283 **********
2026-04-20 00:43:39.515880 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:39.515899 | orchestrator |
2026-04-20 00:43:39.515916 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:39.515933 | orchestrator | Monday 20 April 2026 00:43:34 +0000 (0:00:00.171) 0:00:21.455 **********
2026-04-20 00:43:39.515951 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:39.515969 | orchestrator |
2026-04-20 00:43:39.515986 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:43:39.516004 | orchestrator | Monday 20 April 2026 00:43:34 +0000 (0:00:00.177) 0:00:21.633 **********
2026-04-20 00:43:39.516021 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:39.516038 | orchestrator |
2026-04-20 00:43:39.516055 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] ***********************************************
2026-04-20 00:43:39.516073 | orchestrator | Monday 20 April 2026 00:43:34 +0000 (0:00:00.171) 0:00:21.804 **********
2026-04-20 00:43:39.516090 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': None})
2026-04-20 00:43:39.516223 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': None})
2026-04-20 00:43:39.516249 | orchestrator |
2026-04-20 00:43:39.516268 | orchestrator | TASK [Generate WAL VG names] ***************************************************
2026-04-20 00:43:39.516288 | orchestrator | Monday 20 April 2026 00:43:34 +0000 (0:00:00.282) 0:00:22.087 **********
2026-04-20 00:43:39.516310 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:39.516330 | orchestrator |
2026-04-20 00:43:39.516349 | orchestrator | TASK [Generate DB VG names] ****************************************************
2026-04-20 00:43:39.516368 | orchestrator | Monday 20 April 2026 00:43:35 +0000 (0:00:00.112) 0:00:22.200 **********
2026-04-20 00:43:39.516386 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:39.516404 | orchestrator |
2026-04-20 00:43:39.516423 | orchestrator | TASK [Generate shared DB/WAL VG names] *****************************************
2026-04-20 00:43:39.516445 | orchestrator | Monday 20 April 2026 00:43:35 +0000 (0:00:00.116) 0:00:22.316 **********
2026-04-20 00:43:39.516464 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:39.516483 | orchestrator |
2026-04-20 00:43:39.516501 | orchestrator | TASK [Define lvm_volumes structures] *******************************************
2026-04-20 00:43:39.516519 | orchestrator | Monday 20 April 2026 00:43:35 +0000 (0:00:00.118) 0:00:22.435 **********
2026-04-20 00:43:39.516537 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:43:39.516592 | orchestrator |
2026-04-20 00:43:39.516613 | orchestrator | TASK [Generate lvm_volumes structure (block only)] *****************************
2026-04-20 00:43:39.516632 | orchestrator | Monday 20 April 2026 00:43:35 +0000 (0:00:00.111) 0:00:22.547 **********
2026-04-20 00:43:39.516653 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '7b8b741f-ff85-57a0-9457-c04aa474e6a9'}})
2026-04-20 00:43:39.516672 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'a3c07e85-95b7-5759-bf4d-00aad97d3561'}})
2026-04-20 00:43:39.516691 | orchestrator |
2026-04-20 00:43:39.516711 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] *****************************
2026-04-20 00:43:39.516731 | orchestrator | Monday 20 April 2026 00:43:35 +0000 (0:00:00.147) 0:00:22.695 **********
2026-04-20 00:43:39.516752 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '7b8b741f-ff85-57a0-9457-c04aa474e6a9'}})
2026-04-20 00:43:39.516774 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'a3c07e85-95b7-5759-bf4d-00aad97d3561'}})
2026-04-20 00:43:39.516792 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:39.516809 | orchestrator |
2026-04-20 00:43:39.516826 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] ****************************
2026-04-20 00:43:39.516844 | orchestrator | Monday 20 April 2026 00:43:35 +0000 (0:00:00.157) 0:00:22.852 **********
2026-04-20 00:43:39.516862 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '7b8b741f-ff85-57a0-9457-c04aa474e6a9'}})
2026-04-20 00:43:39.516903 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'a3c07e85-95b7-5759-bf4d-00aad97d3561'}})
2026-04-20 00:43:39.516923 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:39.516942 | orchestrator |
2026-04-20 00:43:39.516960 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] ***********************
2026-04-20 00:43:39.516978 | orchestrator | Monday 20 April 2026 00:43:35 +0000 (0:00:00.138) 0:00:22.991 **********
2026-04-20 00:43:39.516996 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '7b8b741f-ff85-57a0-9457-c04aa474e6a9'}})
2026-04-20 00:43:39.517015 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'a3c07e85-95b7-5759-bf4d-00aad97d3561'}})
2026-04-20 00:43:39.517034 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:43:39.517051 | orchestrator |
2026-04-20 00:43:39.517068 | orchestrator | TASK [Compile lvm_volumes] *****************************************************
2026-04-20 00:43:39.517086 | orchestrator | Monday 20 April 2026 00:43:35 +0000
(0:00:00.140) 0:00:23.132 ********** 2026-04-20 00:43:39.517104 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:43:39.517149 | orchestrator | 2026-04-20 00:43:39.517167 | orchestrator | TASK [Set OSD devices config data] ********************************************* 2026-04-20 00:43:39.517186 | orchestrator | Monday 20 April 2026 00:43:36 +0000 (0:00:00.143) 0:00:23.276 ********** 2026-04-20 00:43:39.517204 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:43:39.517221 | orchestrator | 2026-04-20 00:43:39.517239 | orchestrator | TASK [Set DB devices config data] ********************************************** 2026-04-20 00:43:39.517257 | orchestrator | Monday 20 April 2026 00:43:36 +0000 (0:00:00.127) 0:00:23.404 ********** 2026-04-20 00:43:39.517301 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:43:39.517322 | orchestrator | 2026-04-20 00:43:39.517341 | orchestrator | TASK [Set WAL devices config data] ********************************************* 2026-04-20 00:43:39.517358 | orchestrator | Monday 20 April 2026 00:43:36 +0000 (0:00:00.116) 0:00:23.520 ********** 2026-04-20 00:43:39.517375 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:43:39.517392 | orchestrator | 2026-04-20 00:43:39.517411 | orchestrator | TASK [Set DB+WAL devices config data] ****************************************** 2026-04-20 00:43:39.517429 | orchestrator | Monday 20 April 2026 00:43:36 +0000 (0:00:00.254) 0:00:23.774 ********** 2026-04-20 00:43:39.517447 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:43:39.517465 | orchestrator | 2026-04-20 00:43:39.517499 | orchestrator | TASK [Print ceph_osd_devices] ************************************************** 2026-04-20 00:43:39.517516 | orchestrator | Monday 20 April 2026 00:43:36 +0000 (0:00:00.120) 0:00:23.894 ********** 2026-04-20 00:43:39.517535 | orchestrator | ok: [testbed-node-4] => { 2026-04-20 00:43:39.517553 | orchestrator |  "ceph_osd_devices": { 2026-04-20 00:43:39.517571 | orchestrator |  "sdb": 
{ 2026-04-20 00:43:39.517590 | orchestrator |  "osd_lvm_uuid": "7b8b741f-ff85-57a0-9457-c04aa474e6a9" 2026-04-20 00:43:39.517607 | orchestrator |  }, 2026-04-20 00:43:39.517626 | orchestrator |  "sdc": { 2026-04-20 00:43:39.517645 | orchestrator |  "osd_lvm_uuid": "a3c07e85-95b7-5759-bf4d-00aad97d3561" 2026-04-20 00:43:39.517662 | orchestrator |  } 2026-04-20 00:43:39.517680 | orchestrator |  } 2026-04-20 00:43:39.517700 | orchestrator | } 2026-04-20 00:43:39.517719 | orchestrator | 2026-04-20 00:43:39.517736 | orchestrator | TASK [Print WAL devices] ******************************************************* 2026-04-20 00:43:39.517753 | orchestrator | Monday 20 April 2026 00:43:36 +0000 (0:00:00.125) 0:00:24.020 ********** 2026-04-20 00:43:39.517770 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:43:39.517788 | orchestrator | 2026-04-20 00:43:39.517807 | orchestrator | TASK [Print DB devices] ******************************************************** 2026-04-20 00:43:39.517827 | orchestrator | Monday 20 April 2026 00:43:36 +0000 (0:00:00.124) 0:00:24.144 ********** 2026-04-20 00:43:39.517845 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:43:39.517863 | orchestrator | 2026-04-20 00:43:39.517881 | orchestrator | TASK [Print shared DB/WAL devices] ********************************************* 2026-04-20 00:43:39.517900 | orchestrator | Monday 20 April 2026 00:43:37 +0000 (0:00:00.115) 0:00:24.260 ********** 2026-04-20 00:43:39.517918 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:43:39.517937 | orchestrator | 2026-04-20 00:43:39.517956 | orchestrator | TASK [Print configuration data] ************************************************ 2026-04-20 00:43:39.517974 | orchestrator | Monday 20 April 2026 00:43:37 +0000 (0:00:00.121) 0:00:24.382 ********** 2026-04-20 00:43:39.517992 | orchestrator | changed: [testbed-node-4] => { 2026-04-20 00:43:39.518010 | orchestrator |  "_ceph_configure_lvm_config_data": { 2026-04-20 00:43:39.518255 | orchestrator 
|  "ceph_osd_devices": { 2026-04-20 00:43:39.518276 | orchestrator |  "sdb": { 2026-04-20 00:43:39.518295 | orchestrator |  "osd_lvm_uuid": "7b8b741f-ff85-57a0-9457-c04aa474e6a9" 2026-04-20 00:43:39.518313 | orchestrator |  }, 2026-04-20 00:43:39.518332 | orchestrator |  "sdc": { 2026-04-20 00:43:39.518349 | orchestrator |  "osd_lvm_uuid": "a3c07e85-95b7-5759-bf4d-00aad97d3561" 2026-04-20 00:43:39.518365 | orchestrator |  } 2026-04-20 00:43:39.518383 | orchestrator |  }, 2026-04-20 00:43:39.518400 | orchestrator |  "lvm_volumes": [ 2026-04-20 00:43:39.518413 | orchestrator |  { 2026-04-20 00:43:39.518424 | orchestrator |  "data": "osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9", 2026-04-20 00:43:39.518434 | orchestrator |  "data_vg": "ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9" 2026-04-20 00:43:39.518443 | orchestrator |  }, 2026-04-20 00:43:39.518453 | orchestrator |  { 2026-04-20 00:43:39.518463 | orchestrator |  "data": "osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561", 2026-04-20 00:43:39.518473 | orchestrator |  "data_vg": "ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561" 2026-04-20 00:43:39.518482 | orchestrator |  } 2026-04-20 00:43:39.518492 | orchestrator |  ] 2026-04-20 00:43:39.518501 | orchestrator |  } 2026-04-20 00:43:39.518511 | orchestrator | } 2026-04-20 00:43:39.518521 | orchestrator | 2026-04-20 00:43:39.518530 | orchestrator | RUNNING HANDLER [Write configuration file] ************************************* 2026-04-20 00:43:39.518540 | orchestrator | Monday 20 April 2026 00:43:37 +0000 (0:00:00.172) 0:00:24.555 ********** 2026-04-20 00:43:39.518550 | orchestrator | changed: [testbed-node-4 -> testbed-manager(192.168.16.5)] 2026-04-20 00:43:39.518559 | orchestrator | 2026-04-20 00:43:39.518569 | orchestrator | PLAY [Ceph configure LVM] ****************************************************** 2026-04-20 00:43:39.518593 | orchestrator | 2026-04-20 00:43:39.518603 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 
2026-04-20 00:43:39.518693 | orchestrator | Monday 20 April 2026 00:43:38 +0000 (0:00:01.007) 0:00:25.562 ********** 2026-04-20 00:43:39.518703 | orchestrator | ok: [testbed-node-5 -> testbed-manager(192.168.16.5)] 2026-04-20 00:43:39.518713 | orchestrator | 2026-04-20 00:43:39.518723 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2026-04-20 00:43:39.518733 | orchestrator | Monday 20 April 2026 00:43:38 +0000 (0:00:00.398) 0:00:25.960 ********** 2026-04-20 00:43:39.518743 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:43:39.518753 | orchestrator | 2026-04-20 00:43:39.518762 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:43:39.518772 | orchestrator | Monday 20 April 2026 00:43:39 +0000 (0:00:00.464) 0:00:26.425 ********** 2026-04-20 00:43:39.518782 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop0) 2026-04-20 00:43:39.518792 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop1) 2026-04-20 00:43:39.518801 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop2) 2026-04-20 00:43:39.518824 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop3) 2026-04-20 00:43:39.518835 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop4) 2026-04-20 00:43:39.518860 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop5) 2026-04-20 00:43:46.590717 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop6) 2026-04-20 00:43:46.590834 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop7) 2026-04-20 00:43:46.590853 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sda) 2026-04-20 
00:43:46.590882 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdb) 2026-04-20 00:43:46.590900 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdc) 2026-04-20 00:43:46.590912 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdd) 2026-04-20 00:43:46.590924 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sr0) 2026-04-20 00:43:46.590935 | orchestrator | 2026-04-20 00:43:46.590948 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:43:46.590962 | orchestrator | Monday 20 April 2026 00:43:39 +0000 (0:00:00.354) 0:00:26.780 ********** 2026-04-20 00:43:46.590974 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.590986 | orchestrator | 2026-04-20 00:43:46.590998 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:43:46.591010 | orchestrator | Monday 20 April 2026 00:43:39 +0000 (0:00:00.186) 0:00:26.966 ********** 2026-04-20 00:43:46.591021 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.591032 | orchestrator | 2026-04-20 00:43:46.591044 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:43:46.591056 | orchestrator | Monday 20 April 2026 00:43:39 +0000 (0:00:00.174) 0:00:27.140 ********** 2026-04-20 00:43:46.591066 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.591077 | orchestrator | 2026-04-20 00:43:46.591088 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:43:46.591167 | orchestrator | Monday 20 April 2026 00:43:40 +0000 (0:00:00.208) 0:00:27.348 ********** 2026-04-20 00:43:46.591186 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.591198 | orchestrator | 2026-04-20 00:43:46.591212 | orchestrator 
| TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:43:46.591225 | orchestrator | Monday 20 April 2026 00:43:40 +0000 (0:00:00.160) 0:00:27.508 ********** 2026-04-20 00:43:46.591238 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.591271 | orchestrator | 2026-04-20 00:43:46.591282 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:43:46.591293 | orchestrator | Monday 20 April 2026 00:43:40 +0000 (0:00:00.188) 0:00:27.697 ********** 2026-04-20 00:43:46.591303 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.591314 | orchestrator | 2026-04-20 00:43:46.591324 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:43:46.591333 | orchestrator | Monday 20 April 2026 00:43:40 +0000 (0:00:00.174) 0:00:27.871 ********** 2026-04-20 00:43:46.591344 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.591353 | orchestrator | 2026-04-20 00:43:46.591364 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:43:46.591374 | orchestrator | Monday 20 April 2026 00:43:40 +0000 (0:00:00.179) 0:00:28.051 ********** 2026-04-20 00:43:46.591384 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.591393 | orchestrator | 2026-04-20 00:43:46.591403 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:43:46.591413 | orchestrator | Monday 20 April 2026 00:43:41 +0000 (0:00:00.181) 0:00:28.232 ********** 2026-04-20 00:43:46.591423 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203) 2026-04-20 00:43:46.591434 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203) 2026-04-20 00:43:46.591443 | orchestrator | 2026-04-20 00:43:46.591454 | orchestrator | TASK [Add 
known links to the list of available block devices] ****************** 2026-04-20 00:43:46.591463 | orchestrator | Monday 20 April 2026 00:43:41 +0000 (0:00:00.515) 0:00:28.748 ********** 2026-04-20 00:43:46.591475 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_bdcbd50e-fc40-4173-bc88-351fd741a560) 2026-04-20 00:43:46.591491 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_bdcbd50e-fc40-4173-bc88-351fd741a560) 2026-04-20 00:43:46.591506 | orchestrator | 2026-04-20 00:43:46.591521 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:43:46.591536 | orchestrator | Monday 20 April 2026 00:43:42 +0000 (0:00:00.628) 0:00:29.376 ********** 2026-04-20 00:43:46.591549 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_bb585aa1-11e8-43ef-a761-9431875b84d1) 2026-04-20 00:43:46.591564 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_bb585aa1-11e8-43ef-a761-9431875b84d1) 2026-04-20 00:43:46.591580 | orchestrator | 2026-04-20 00:43:46.591595 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:43:46.591609 | orchestrator | Monday 20 April 2026 00:43:42 +0000 (0:00:00.364) 0:00:29.740 ********** 2026-04-20 00:43:46.591619 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_6895d0f2-ba69-41e1-a4cc-d0f527389fe4) 2026-04-20 00:43:46.591629 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_6895d0f2-ba69-41e1-a4cc-d0f527389fe4) 2026-04-20 00:43:46.591639 | orchestrator | 2026-04-20 00:43:46.591648 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:43:46.591656 | orchestrator | Monday 20 April 2026 00:43:42 +0000 (0:00:00.375) 0:00:30.116 ********** 2026-04-20 00:43:46.591665 | orchestrator | ok: [testbed-node-5] => (item=ata-QEMU_DVD-ROM_QM00001) 2026-04-20 00:43:46.591674 | 
orchestrator | 2026-04-20 00:43:46.591682 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:43:46.591726 | orchestrator | Monday 20 April 2026 00:43:43 +0000 (0:00:00.313) 0:00:30.430 ********** 2026-04-20 00:43:46.591745 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop0) 2026-04-20 00:43:46.591760 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop1) 2026-04-20 00:43:46.591772 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop2) 2026-04-20 00:43:46.591782 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop3) 2026-04-20 00:43:46.591801 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop4) 2026-04-20 00:43:46.591810 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop5) 2026-04-20 00:43:46.591818 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop6) 2026-04-20 00:43:46.591827 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop7) 2026-04-20 00:43:46.591836 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sda) 2026-04-20 00:43:46.591844 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdb) 2026-04-20 00:43:46.591852 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdc) 2026-04-20 00:43:46.591861 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdd) 2026-04-20 00:43:46.591885 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sr0) 2026-04-20 00:43:46.591894 | orchestrator | 
2026-04-20 00:43:46.591903 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:43:46.591911 | orchestrator | Monday 20 April 2026 00:43:43 +0000 (0:00:00.341) 0:00:30.772 ********** 2026-04-20 00:43:46.591923 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.591932 | orchestrator | 2026-04-20 00:43:46.591941 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:43:46.591950 | orchestrator | Monday 20 April 2026 00:43:43 +0000 (0:00:00.177) 0:00:30.950 ********** 2026-04-20 00:43:46.591958 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.591966 | orchestrator | 2026-04-20 00:43:46.591975 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:43:46.591983 | orchestrator | Monday 20 April 2026 00:43:43 +0000 (0:00:00.172) 0:00:31.122 ********** 2026-04-20 00:43:46.591992 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.592000 | orchestrator | 2026-04-20 00:43:46.592009 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:43:46.592017 | orchestrator | Monday 20 April 2026 00:43:44 +0000 (0:00:00.177) 0:00:31.300 ********** 2026-04-20 00:43:46.592026 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.592034 | orchestrator | 2026-04-20 00:43:46.592043 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:43:46.592051 | orchestrator | Monday 20 April 2026 00:43:44 +0000 (0:00:00.172) 0:00:31.472 ********** 2026-04-20 00:43:46.592060 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.592068 | orchestrator | 2026-04-20 00:43:46.592077 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:43:46.592085 | orchestrator | Monday 20 April 2026 00:43:44 +0000 
(0:00:00.173) 0:00:31.646 ********** 2026-04-20 00:43:46.592094 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.592102 | orchestrator | 2026-04-20 00:43:46.592110 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:43:46.592119 | orchestrator | Monday 20 April 2026 00:43:44 +0000 (0:00:00.441) 0:00:32.087 ********** 2026-04-20 00:43:46.592127 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.592213 | orchestrator | 2026-04-20 00:43:46.592232 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:43:46.592246 | orchestrator | Monday 20 April 2026 00:43:45 +0000 (0:00:00.177) 0:00:32.265 ********** 2026-04-20 00:43:46.592260 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.592273 | orchestrator | 2026-04-20 00:43:46.592287 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:43:46.592302 | orchestrator | Monday 20 April 2026 00:43:45 +0000 (0:00:00.161) 0:00:32.426 ********** 2026-04-20 00:43:46.592316 | orchestrator | ok: [testbed-node-5] => (item=sda1) 2026-04-20 00:43:46.592330 | orchestrator | ok: [testbed-node-5] => (item=sda14) 2026-04-20 00:43:46.592357 | orchestrator | ok: [testbed-node-5] => (item=sda15) 2026-04-20 00:43:46.592372 | orchestrator | ok: [testbed-node-5] => (item=sda16) 2026-04-20 00:43:46.592388 | orchestrator | 2026-04-20 00:43:46.592402 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:43:46.592416 | orchestrator | Monday 20 April 2026 00:43:45 +0000 (0:00:00.576) 0:00:33.002 ********** 2026-04-20 00:43:46.592431 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.592444 | orchestrator | 2026-04-20 00:43:46.592456 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:43:46.592468 | orchestrator 
| Monday 20 April 2026 00:43:46 +0000 (0:00:00.185) 0:00:33.187 ********** 2026-04-20 00:43:46.592480 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.592494 | orchestrator | 2026-04-20 00:43:46.592508 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:43:46.592523 | orchestrator | Monday 20 April 2026 00:43:46 +0000 (0:00:00.218) 0:00:33.406 ********** 2026-04-20 00:43:46.592539 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.592548 | orchestrator | 2026-04-20 00:43:46.592557 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:43:46.592566 | orchestrator | Monday 20 April 2026 00:43:46 +0000 (0:00:00.178) 0:00:33.584 ********** 2026-04-20 00:43:46.592574 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:46.592583 | orchestrator | 2026-04-20 00:43:46.592603 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] *********************************************** 2026-04-20 00:43:50.297507 | orchestrator | Monday 20 April 2026 00:43:46 +0000 (0:00:00.185) 0:00:33.769 ********** 2026-04-20 00:43:50.297619 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': None}) 2026-04-20 00:43:50.297632 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': None}) 2026-04-20 00:43:50.297642 | orchestrator | 2026-04-20 00:43:50.297652 | orchestrator | TASK [Generate WAL VG names] *************************************************** 2026-04-20 00:43:50.297662 | orchestrator | Monday 20 April 2026 00:43:46 +0000 (0:00:00.175) 0:00:33.945 ********** 2026-04-20 00:43:50.297671 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:50.297679 | orchestrator | 2026-04-20 00:43:50.297688 | orchestrator | TASK [Generate DB VG names] **************************************************** 2026-04-20 00:43:50.297697 | orchestrator | Monday 20 April 2026 00:43:46 +0000 (0:00:00.114) 0:00:34.060 ********** 
2026-04-20 00:43:50.297706 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:50.297714 | orchestrator | 2026-04-20 00:43:50.297723 | orchestrator | TASK [Generate shared DB/WAL VG names] ***************************************** 2026-04-20 00:43:50.297731 | orchestrator | Monday 20 April 2026 00:43:47 +0000 (0:00:00.142) 0:00:34.202 ********** 2026-04-20 00:43:50.297740 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:50.297748 | orchestrator | 2026-04-20 00:43:50.297757 | orchestrator | TASK [Define lvm_volumes structures] ******************************************* 2026-04-20 00:43:50.297766 | orchestrator | Monday 20 April 2026 00:43:47 +0000 (0:00:00.124) 0:00:34.327 ********** 2026-04-20 00:43:50.297775 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:43:50.297785 | orchestrator | 2026-04-20 00:43:50.297794 | orchestrator | TASK [Generate lvm_volumes structure (block only)] ***************************** 2026-04-20 00:43:50.297802 | orchestrator | Monday 20 April 2026 00:43:47 +0000 (0:00:00.263) 0:00:34.590 ********** 2026-04-20 00:43:50.297812 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'f2b53557-bc93-5e7c-9922-524bc90e2f58'}}) 2026-04-20 00:43:50.297822 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '575cdf11-a3b3-50b3-a6b0-c04d40287ec6'}}) 2026-04-20 00:43:50.297830 | orchestrator | 2026-04-20 00:43:50.297839 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] ***************************** 2026-04-20 00:43:50.297848 | orchestrator | Monday 20 April 2026 00:43:47 +0000 (0:00:00.147) 0:00:34.738 ********** 2026-04-20 00:43:50.297857 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'f2b53557-bc93-5e7c-9922-524bc90e2f58'}})  2026-04-20 00:43:50.297891 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '575cdf11-a3b3-50b3-a6b0-c04d40287ec6'}})  
2026-04-20 00:43:50.297901 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:50.297916 | orchestrator | 2026-04-20 00:43:50.297930 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] **************************** 2026-04-20 00:43:50.297944 | orchestrator | Monday 20 April 2026 00:43:47 +0000 (0:00:00.132) 0:00:34.870 ********** 2026-04-20 00:43:50.297958 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'f2b53557-bc93-5e7c-9922-524bc90e2f58'}})  2026-04-20 00:43:50.297973 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '575cdf11-a3b3-50b3-a6b0-c04d40287ec6'}})  2026-04-20 00:43:50.297987 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:50.298001 | orchestrator | 2026-04-20 00:43:50.298190 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] *********************** 2026-04-20 00:43:50.298210 | orchestrator | Monday 20 April 2026 00:43:47 +0000 (0:00:00.133) 0:00:35.004 ********** 2026-04-20 00:43:50.298220 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'f2b53557-bc93-5e7c-9922-524bc90e2f58'}})  2026-04-20 00:43:50.298248 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '575cdf11-a3b3-50b3-a6b0-c04d40287ec6'}})  2026-04-20 00:43:50.298285 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:43:50.298324 | orchestrator | 2026-04-20 00:43:50.298354 | orchestrator | TASK [Compile lvm_volumes] ***************************************************** 2026-04-20 00:43:50.298370 | orchestrator | Monday 20 April 2026 00:43:47 +0000 (0:00:00.129) 0:00:35.133 ********** 2026-04-20 00:43:50.298400 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:43:50.298415 | orchestrator | 2026-04-20 00:43:50.298444 | orchestrator | TASK [Set OSD devices config data] ********************************************* 2026-04-20 00:43:50.298458 | 
orchestrator | Monday 20 April 2026 00:43:48 +0000 (0:00:00.123) 0:00:35.257 **********
2026-04-20 00:43:50.298471 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:43:50.298500 | orchestrator |
2026-04-20 00:43:50.298513 | orchestrator | TASK [Set DB devices config data] **********************************************
2026-04-20 00:43:50.298543 | orchestrator | Monday 20 April 2026 00:43:48 +0000 (0:00:00.132) 0:00:35.389 **********
2026-04-20 00:43:50.298556 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:43:50.298588 | orchestrator |
2026-04-20 00:43:50.298747 | orchestrator | TASK [Set WAL devices config data] *********************************************
2026-04-20 00:43:50.298776 | orchestrator | Monday 20 April 2026 00:43:48 +0000 (0:00:00.120) 0:00:35.509 **********
2026-04-20 00:43:50.298796 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:43:50.298805 | orchestrator |
2026-04-20 00:43:50.298814 | orchestrator | TASK [Set DB+WAL devices config data] ******************************************
2026-04-20 00:43:50.298823 | orchestrator | Monday 20 April 2026 00:43:48 +0000 (0:00:00.120) 0:00:35.630 **********
2026-04-20 00:43:50.298843 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:43:50.298852 | orchestrator |
2026-04-20 00:43:50.298861 | orchestrator | TASK [Print ceph_osd_devices] **************************************************
2026-04-20 00:43:50.298870 | orchestrator | Monday 20 April 2026 00:43:48 +0000 (0:00:00.125) 0:00:35.756 **********
2026-04-20 00:43:50.298878 | orchestrator | ok: [testbed-node-5] => {
2026-04-20 00:43:50.298900 | orchestrator |  "ceph_osd_devices": {
2026-04-20 00:43:50.298909 | orchestrator |  "sdb": {
2026-04-20 00:43:50.298953 | orchestrator |  "osd_lvm_uuid": "f2b53557-bc93-5e7c-9922-524bc90e2f58"
2026-04-20 00:43:50.298964 | orchestrator |  },
2026-04-20 00:43:50.298973 | orchestrator |  "sdc": {
2026-04-20 00:43:50.298994 | orchestrator |  "osd_lvm_uuid": "575cdf11-a3b3-50b3-a6b0-c04d40287ec6"
2026-04-20 00:43:50.299003 | orchestrator |  }
2026-04-20 00:43:50.299012 | orchestrator |  }
2026-04-20 00:43:50.299021 | orchestrator | }
2026-04-20 00:43:50.299030 | orchestrator |
2026-04-20 00:43:50.299039 | orchestrator | TASK [Print WAL devices] *******************************************************
2026-04-20 00:43:50.299075 | orchestrator | Monday 20 April 2026 00:43:48 +0000 (0:00:00.129) 0:00:35.886 **********
2026-04-20 00:43:50.299084 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:43:50.299106 | orchestrator |
2026-04-20 00:43:50.299115 | orchestrator | TASK [Print DB devices] ********************************************************
2026-04-20 00:43:50.299144 | orchestrator | Monday 20 April 2026 00:43:48 +0000 (0:00:00.119) 0:00:36.005 **********
2026-04-20 00:43:50.299191 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:43:50.299200 | orchestrator |
2026-04-20 00:43:50.299222 | orchestrator | TASK [Print shared DB/WAL devices] *********************************************
2026-04-20 00:43:50.299231 | orchestrator | Monday 20 April 2026 00:43:49 +0000 (0:00:00.263) 0:00:36.269 **********
2026-04-20 00:43:50.299252 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:43:50.299261 | orchestrator |
2026-04-20 00:43:50.299270 | orchestrator | TASK [Print configuration data] ************************************************
2026-04-20 00:43:50.299279 | orchestrator | Monday 20 April 2026 00:43:49 +0000 (0:00:00.130) 0:00:36.399 **********
2026-04-20 00:43:50.299288 | orchestrator | changed: [testbed-node-5] => {
2026-04-20 00:43:50.299310 | orchestrator |  "_ceph_configure_lvm_config_data": {
2026-04-20 00:43:50.299319 | orchestrator |  "ceph_osd_devices": {
2026-04-20 00:43:50.299328 | orchestrator |  "sdb": {
2026-04-20 00:43:50.299349 | orchestrator |  "osd_lvm_uuid": "f2b53557-bc93-5e7c-9922-524bc90e2f58"
2026-04-20 00:43:50.299358 | orchestrator |  },
2026-04-20 00:43:50.299367 | orchestrator |  "sdc": {
2026-04-20 00:43:50.299387 | orchestrator |  "osd_lvm_uuid": "575cdf11-a3b3-50b3-a6b0-c04d40287ec6"
2026-04-20 00:43:50.299405 | orchestrator |  }
2026-04-20 00:43:50.299426 | orchestrator |  },
2026-04-20 00:43:50.299435 | orchestrator |  "lvm_volumes": [
2026-04-20 00:43:50.299444 | orchestrator |  {
2026-04-20 00:43:50.299464 | orchestrator |  "data": "osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58",
2026-04-20 00:43:50.299474 | orchestrator |  "data_vg": "ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58"
2026-04-20 00:43:50.299483 | orchestrator |  },
2026-04-20 00:43:50.299504 | orchestrator |  {
2026-04-20 00:43:50.299518 | orchestrator |  "data": "osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6",
2026-04-20 00:43:50.299527 | orchestrator |  "data_vg": "ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6"
2026-04-20 00:43:50.299547 | orchestrator |  }
2026-04-20 00:43:50.299557 | orchestrator |  ]
2026-04-20 00:43:50.299565 | orchestrator |  }
2026-04-20 00:43:50.299591 | orchestrator | }
2026-04-20 00:43:50.299611 | orchestrator |
2026-04-20 00:43:50.299647 | orchestrator | RUNNING HANDLER [Write configuration file] *************************************
2026-04-20 00:43:50.299677 | orchestrator | Monday 20 April 2026 00:43:49 +0000 (0:00:00.186) 0:00:36.586 **********
2026-04-20 00:43:50.299692 | orchestrator | changed: [testbed-node-5 -> testbed-manager(192.168.16.5)]
2026-04-20 00:43:50.299723 | orchestrator |
2026-04-20 00:43:50.299738 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:43:50.299772 | orchestrator | testbed-node-3 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0
2026-04-20 00:43:50.299819 | orchestrator | testbed-node-4 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0
2026-04-20 00:43:50.299833 | orchestrator | testbed-node-5 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0
2026-04-20 00:43:50.299864 | orchestrator |
2026-04-20 00:43:50.299877 | orchestrator |
2026-04-20 00:43:50.299910 | orchestrator |
2026-04-20 00:43:50.299942 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:43:50.299952 | orchestrator | Monday 20 April 2026 00:43:50 +0000 (0:00:00.882) 0:00:37.468 **********
2026-04-20 00:43:50.299961 | orchestrator | ===============================================================================
2026-04-20 00:43:50.299993 | orchestrator | Write configuration file ------------------------------------------------ 3.92s
2026-04-20 00:43:50.300002 | orchestrator | Add known partitions to the list of available block devices ------------- 1.08s
2026-04-20 00:43:50.300024 | orchestrator | Add known links to the list of available block devices ------------------ 1.06s
2026-04-20 00:43:50.300033 | orchestrator | Add known partitions to the list of available block devices ------------- 0.92s
2026-04-20 00:43:50.300042 | orchestrator | Get initial list of available block devices ----------------------------- 0.89s
2026-04-20 00:43:50.300061 | orchestrator | Get extra vars for Ceph configuration ----------------------------------- 0.88s
2026-04-20 00:43:50.300070 | orchestrator | Add known links to the list of available block devices ------------------ 0.74s
2026-04-20 00:43:50.300079 | orchestrator | Add known links to the list of available block devices ------------------ 0.63s
2026-04-20 00:43:50.300099 | orchestrator | Add known links to the list of available block devices ------------------ 0.62s
2026-04-20 00:43:50.300108 | orchestrator | Add known links to the list of available block devices ------------------ 0.61s
2026-04-20 00:43:50.300116 | orchestrator | Set UUIDs for OSD VGs/LVs ----------------------------------------------- 0.61s
2026-04-20 00:43:50.300125 | orchestrator | Add known partitions to the list of available block devices ------------- 0.59s
2026-04-20 00:43:50.300181 | orchestrator | Add known partitions to the list of available block devices ------------- 0.58s
2026-04-20 00:43:50.300204 | orchestrator | Generate lvm_volumes structure (block + db + wal) ----------------------- 0.57s
2026-04-20 00:43:50.491509 | orchestrator | Print configuration data ------------------------------------------------ 0.54s
2026-04-20 00:43:50.491635 | orchestrator | Add known links to the list of available block devices ------------------ 0.52s
2026-04-20 00:43:50.491657 | orchestrator | Add known links to the list of available block devices ------------------ 0.52s
2026-04-20 00:43:50.491676 | orchestrator | Print DB devices -------------------------------------------------------- 0.51s
2026-04-20 00:43:50.491695 | orchestrator | Define lvm_volumes structures ------------------------------------------- 0.50s
2026-04-20 00:43:50.491714 | orchestrator | Set WAL devices config data --------------------------------------------- 0.50s
2026-04-20 00:44:11.916600 | orchestrator | 2026-04-20 00:44:11 | INFO  | Task 43cfa694-f8e8-40cb-af78-bcf3cb999e5d (sync inventory) is running in background. Output coming soon.
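
The "Print configuration data" task above shows the transformation this play performs: each entry in ceph_osd_devices carries a stable osd_lvm_uuid, from which a lvm_volumes entry is derived with the LV named osd-block-<uuid> inside a VG named ceph-<uuid>. A minimal sketch of that mapping (not the actual OSISM playbook code, just the naming convention visible in the log output):

```python
# Hedged sketch: derive the lvm_volumes structure printed by the
# "Print configuration data" task from a ceph_osd_devices mapping.
# Naming convention taken from the log: LV "osd-block-<uuid>",
# VG "ceph-<uuid>". This is illustrative, not the OSISM implementation.
def build_lvm_volumes(ceph_osd_devices: dict) -> list:
    return [
        {
            "data": f"osd-block-{cfg['osd_lvm_uuid']}",
            "data_vg": f"ceph-{cfg['osd_lvm_uuid']}",
        }
        for _, cfg in sorted(ceph_osd_devices.items())
    ]

# Values taken from the testbed-node-5 output above.
devices = {
    "sdb": {"osd_lvm_uuid": "f2b53557-bc93-5e7c-9922-524bc90e2f58"},
    "sdc": {"osd_lvm_uuid": "575cdf11-a3b3-50b3-a6b0-c04d40287ec6"},
}
volumes = build_lvm_volumes(devices)
```

The resulting list matches the lvm_volumes block printed in the log and is what is later consumed to create the block VGs and LVs per node.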
2026-04-20 00:44:38.696784 | orchestrator | 2026-04-20 00:44:13 | INFO  | Starting group_vars file reorganization
2026-04-20 00:44:38.696885 | orchestrator | 2026-04-20 00:44:13 | INFO  | Moved 0 file(s) to their respective directories
2026-04-20 00:44:38.696895 | orchestrator | 2026-04-20 00:44:13 | INFO  | Group_vars file reorganization completed
2026-04-20 00:44:38.696900 | orchestrator | 2026-04-20 00:44:15 | INFO  | Starting variable preparation from inventory
2026-04-20 00:44:38.696905 | orchestrator | 2026-04-20 00:44:17 | INFO  | Writing 050-kolla-ceph-rgw-hosts.yml with ceph_rgw_hosts
2026-04-20 00:44:38.696910 | orchestrator | 2026-04-20 00:44:17 | INFO  | Writing 050-infrastructure-cephclient-mons.yml with cephclient_mons
2026-04-20 00:44:38.696915 | orchestrator | 2026-04-20 00:44:17 | INFO  | Writing 050-ceph-cluster-fsid.yml with ceph_cluster_fsid
2026-04-20 00:44:38.696920 | orchestrator | 2026-04-20 00:44:17 | INFO  | 3 file(s) written, 6 host(s) processed
2026-04-20 00:44:38.696924 | orchestrator | 2026-04-20 00:44:17 | INFO  | Variable preparation completed
2026-04-20 00:44:38.696929 | orchestrator | 2026-04-20 00:44:19 | INFO  | Starting inventory overwrite handling
2026-04-20 00:44:38.696933 | orchestrator | 2026-04-20 00:44:19 | INFO  | Handling group overwrites in 99-overwrite
2026-04-20 00:44:38.696937 | orchestrator | 2026-04-20 00:44:19 | INFO  | Removing group frr:children from 60-generic
2026-04-20 00:44:38.696941 | orchestrator | 2026-04-20 00:44:19 | INFO  | Removing group netbird:children from 50-infrastructure
2026-04-20 00:44:38.696966 | orchestrator | 2026-04-20 00:44:19 | INFO  | Removing group ceph-mds from 50-ceph
2026-04-20 00:44:38.696971 | orchestrator | 2026-04-20 00:44:19 | INFO  | Removing group ceph-rgw from 50-ceph
2026-04-20 00:44:38.696975 | orchestrator | 2026-04-20 00:44:19 | INFO  | Handling group overwrites in 20-roles
2026-04-20 00:44:38.696993 | orchestrator | 2026-04-20 00:44:19 | INFO  | Removing group k3s_node from 50-infrastructure
2026-04-20 00:44:38.696997 | orchestrator | 2026-04-20 00:44:19 | INFO  | Removed 5 group(s) in total
2026-04-20 00:44:38.697001 | orchestrator | 2026-04-20 00:44:19 | INFO  | Inventory overwrite handling completed
2026-04-20 00:44:38.697005 | orchestrator | 2026-04-20 00:44:20 | INFO  | Starting merge of inventory files
2026-04-20 00:44:38.697009 | orchestrator | 2026-04-20 00:44:20 | INFO  | Inventory files merged successfully
2026-04-20 00:44:38.697012 | orchestrator | 2026-04-20 00:44:24 | INFO  | Generating minified hosts file
2026-04-20 00:44:38.697016 | orchestrator | 2026-04-20 00:44:25 | INFO  | Successfully wrote minified hosts file to /inventory.merge/hosts-minified.yml
2026-04-20 00:44:38.697021 | orchestrator | 2026-04-20 00:44:25 | INFO  | Successfully wrote fast inventory to /inventory.merge/fast/hosts.json
2026-04-20 00:44:38.697025 | orchestrator | 2026-04-20 00:44:27 | INFO  | Generating ClusterShell configuration from Ansible inventory
2026-04-20 00:44:38.697029 | orchestrator | 2026-04-20 00:44:37 | INFO  | Successfully wrote ClusterShell configuration
2026-04-20 00:44:38.697033 | orchestrator | [master 0243c88] 2026-04-20-00-44
2026-04-20 00:44:38.697038 | orchestrator | 5 files changed, 75 insertions(+), 10 deletions(-)
2026-04-20 00:44:38.697043 | orchestrator | create mode 100644 fast/host_vars/testbed-node-3/ceph-lvm-configuration.yml
2026-04-20 00:44:38.697047 | orchestrator | create mode 100644 fast/host_vars/testbed-node-4/ceph-lvm-configuration.yml
2026-04-20 00:44:38.697051 | orchestrator | create mode 100644 fast/host_vars/testbed-node-5/ceph-lvm-configuration.yml
2026-04-20 00:44:40.102967 | orchestrator | 2026-04-20 00:44:40 | INFO  | Prepare task for execution of ceph-create-lvm-devices.
2026-04-20 00:44:40.164657 | orchestrator | 2026-04-20 00:44:40 | INFO  | Task a9f821d5-15e9-4405-8e48-b548256d55ba (ceph-create-lvm-devices) was prepared for execution.
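
The "inventory overwrite handling" messages above describe a layered inventory: a higher-priority layer (such as 99-overwrite or 20-roles) removes the groups it redefines from lower-priority layers before the files are merged. A minimal sketch of that mechanism, with hypothetical data structures (this is not the osism CLI code; the layer and group names are the ones logged above):

```python
# Hedged sketch of layered-inventory overwrite handling, not the osism
# implementation: drop each overridden group from the lower-priority
# layer that originally defined it, mirroring the
# "Removing group <group> from <layer>" log messages.
def apply_group_overwrites(layers: dict, overwrites: list) -> int:
    """layers maps layer name -> {group name: group body};
    overwrites lists (group, layer_to_remove_from) pairs."""
    removed = 0
    for group, layer in overwrites:
        if group in layers.get(layer, {}):
            del layers[layer][group]
            removed += 1
    return removed

# Layer and group names taken from the log output above.
layers = {
    "60-generic": {"frr:children": []},
    "50-infrastructure": {"netbird:children": [], "k3s_node": []},
    "50-ceph": {"ceph-mds": [], "ceph-rgw": []},
}
overwrites = [
    ("frr:children", "60-generic"),
    ("netbird:children", "50-infrastructure"),
    ("ceph-mds", "50-ceph"),
    ("ceph-rgw", "50-ceph"),
    ("k3s_node", "50-infrastructure"),
]
removed = apply_group_overwrites(layers, overwrites)
```

With this input the function removes five groups, consistent with the "Removed 5 group(s) in total" message, after which the remaining layers can be merged without duplicate group definitions.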
2026-04-20 00:44:40.164811 | orchestrator | 2026-04-20 00:44:40 | INFO  | It takes a moment until task a9f821d5-15e9-4405-8e48-b548256d55ba (ceph-create-lvm-devices) has been started and output is visible here. 2026-04-20 00:44:50.740630 | orchestrator | [WARNING]: Collection community.general does not support Ansible version 2026-04-20 00:44:50.740693 | orchestrator | 2.16.14 2026-04-20 00:44:50.740702 | orchestrator | 2026-04-20 00:44:50.740710 | orchestrator | PLAY [Ceph create LVM devices] ************************************************* 2026-04-20 00:44:50.740717 | orchestrator | 2026-04-20 00:44:50.740724 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2026-04-20 00:44:50.740731 | orchestrator | Monday 20 April 2026 00:44:44 +0000 (0:00:00.240) 0:00:00.240 ********** 2026-04-20 00:44:50.740738 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] 2026-04-20 00:44:50.740744 | orchestrator | 2026-04-20 00:44:50.740750 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2026-04-20 00:44:50.740756 | orchestrator | Monday 20 April 2026 00:44:44 +0000 (0:00:00.214) 0:00:00.455 ********** 2026-04-20 00:44:50.740763 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:44:50.740769 | orchestrator | 2026-04-20 00:44:50.740776 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:44:50.740782 | orchestrator | Monday 20 April 2026 00:44:44 +0000 (0:00:00.183) 0:00:00.639 ********** 2026-04-20 00:44:50.740788 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop0) 2026-04-20 00:44:50.740826 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop1) 2026-04-20 00:44:50.740833 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop2) 2026-04-20 00:44:50.740838 | orchestrator | 
included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop3) 2026-04-20 00:44:50.740845 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop4) 2026-04-20 00:44:50.740851 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop5) 2026-04-20 00:44:50.740862 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop6) 2026-04-20 00:44:50.740869 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop7) 2026-04-20 00:44:50.740875 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sda) 2026-04-20 00:44:50.740881 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdb) 2026-04-20 00:44:50.740887 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdc) 2026-04-20 00:44:50.740893 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdd) 2026-04-20 00:44:50.740899 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sr0) 2026-04-20 00:44:50.740904 | orchestrator | 2026-04-20 00:44:50.740910 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:44:50.740916 | orchestrator | Monday 20 April 2026 00:44:44 +0000 (0:00:00.349) 0:00:00.988 ********** 2026-04-20 00:44:50.740922 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:50.740928 | orchestrator | 2026-04-20 00:44:50.740934 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:44:50.740940 | orchestrator | Monday 20 April 2026 00:44:45 +0000 (0:00:00.359) 0:00:01.348 ********** 2026-04-20 00:44:50.740946 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:50.740952 | orchestrator | 2026-04-20 00:44:50.740958 | 
orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:44:50.740965 | orchestrator | Monday 20 April 2026 00:44:45 +0000 (0:00:00.173) 0:00:01.521 ********** 2026-04-20 00:44:50.740971 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:50.740977 | orchestrator | 2026-04-20 00:44:50.740983 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:44:50.740989 | orchestrator | Monday 20 April 2026 00:44:45 +0000 (0:00:00.172) 0:00:01.694 ********** 2026-04-20 00:44:50.740995 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:50.741002 | orchestrator | 2026-04-20 00:44:50.741008 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:44:50.741014 | orchestrator | Monday 20 April 2026 00:44:45 +0000 (0:00:00.177) 0:00:01.872 ********** 2026-04-20 00:44:50.741020 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:50.741026 | orchestrator | 2026-04-20 00:44:50.741032 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:44:50.741038 | orchestrator | Monday 20 April 2026 00:44:46 +0000 (0:00:00.172) 0:00:02.045 ********** 2026-04-20 00:44:50.741044 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:50.741051 | orchestrator | 2026-04-20 00:44:50.741057 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:44:50.741063 | orchestrator | Monday 20 April 2026 00:44:46 +0000 (0:00:00.199) 0:00:02.244 ********** 2026-04-20 00:44:50.741070 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:50.741076 | orchestrator | 2026-04-20 00:44:50.741082 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:44:50.741089 | orchestrator | Monday 20 April 2026 00:44:46 +0000 (0:00:00.187) 0:00:02.432 ********** 
2026-04-20 00:44:50.741095 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:50.741101 | orchestrator | 2026-04-20 00:44:50.741107 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:44:50.741117 | orchestrator | Monday 20 April 2026 00:44:46 +0000 (0:00:00.208) 0:00:02.641 ********** 2026-04-20 00:44:50.741123 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e) 2026-04-20 00:44:50.741130 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e) 2026-04-20 00:44:50.741136 | orchestrator | 2026-04-20 00:44:50.741143 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:44:50.741158 | orchestrator | Monday 20 April 2026 00:44:47 +0000 (0:00:00.411) 0:00:03.052 ********** 2026-04-20 00:44:50.741164 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_71e5e2fe-8079-44a9-83c9-718c1a37ec11) 2026-04-20 00:44:50.741171 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_71e5e2fe-8079-44a9-83c9-718c1a37ec11) 2026-04-20 00:44:50.741177 | orchestrator | 2026-04-20 00:44:50.741183 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:44:50.741189 | orchestrator | Monday 20 April 2026 00:44:47 +0000 (0:00:00.390) 0:00:03.442 ********** 2026-04-20 00:44:50.741195 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_0c844390-ddcc-47db-87c2-e0ad3f299f11) 2026-04-20 00:44:50.741201 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_0c844390-ddcc-47db-87c2-e0ad3f299f11) 2026-04-20 00:44:50.741207 | orchestrator | 2026-04-20 00:44:50.741214 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:44:50.741220 | orchestrator | Monday 20 April 2026 00:44:47 +0000 
(0:00:00.547) 0:00:03.990 ********** 2026-04-20 00:44:50.741227 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_4d9b431e-9b52-486b-bddb-3e9e0ee5fa39) 2026-04-20 00:44:50.741233 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_4d9b431e-9b52-486b-bddb-3e9e0ee5fa39) 2026-04-20 00:44:50.741239 | orchestrator | 2026-04-20 00:44:50.741246 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:44:50.741252 | orchestrator | Monday 20 April 2026 00:44:48 +0000 (0:00:00.550) 0:00:04.540 ********** 2026-04-20 00:44:50.741259 | orchestrator | ok: [testbed-node-3] => (item=ata-QEMU_DVD-ROM_QM00001) 2026-04-20 00:44:50.741265 | orchestrator | 2026-04-20 00:44:50.741272 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:44:50.741279 | orchestrator | Monday 20 April 2026 00:44:49 +0000 (0:00:00.577) 0:00:05.118 ********** 2026-04-20 00:44:50.741296 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop0) 2026-04-20 00:44:50.741309 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop1) 2026-04-20 00:44:50.741315 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop2) 2026-04-20 00:44:50.741322 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop3) 2026-04-20 00:44:50.741340 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop4) 2026-04-20 00:44:50.741347 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop5) 2026-04-20 00:44:50.741353 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop6) 2026-04-20 00:44:50.741365 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for 
testbed-node-3 => (item=loop7) 2026-04-20 00:44:50.741371 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sda) 2026-04-20 00:44:50.741377 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdb) 2026-04-20 00:44:50.741384 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdc) 2026-04-20 00:44:50.741390 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdd) 2026-04-20 00:44:50.741401 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sr0) 2026-04-20 00:44:50.741408 | orchestrator | 2026-04-20 00:44:50.741414 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:44:50.741421 | orchestrator | Monday 20 April 2026 00:44:49 +0000 (0:00:00.372) 0:00:05.490 ********** 2026-04-20 00:44:50.741427 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:50.741434 | orchestrator | 2026-04-20 00:44:50.741441 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:44:50.741447 | orchestrator | Monday 20 April 2026 00:44:49 +0000 (0:00:00.180) 0:00:05.670 ********** 2026-04-20 00:44:50.741453 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:50.741460 | orchestrator | 2026-04-20 00:44:50.741467 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:44:50.741473 | orchestrator | Monday 20 April 2026 00:44:49 +0000 (0:00:00.178) 0:00:05.849 ********** 2026-04-20 00:44:50.741480 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:50.741486 | orchestrator | 2026-04-20 00:44:50.741493 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:44:50.741500 | orchestrator | Monday 20 April 2026 00:44:50 +0000 
(0:00:00.190) 0:00:06.040 ********** 2026-04-20 00:44:50.741506 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:50.741513 | orchestrator | 2026-04-20 00:44:50.741520 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:44:50.741526 | orchestrator | Monday 20 April 2026 00:44:50 +0000 (0:00:00.175) 0:00:06.216 ********** 2026-04-20 00:44:50.741533 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:50.741540 | orchestrator | 2026-04-20 00:44:50.741546 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:44:50.741553 | orchestrator | Monday 20 April 2026 00:44:50 +0000 (0:00:00.165) 0:00:06.381 ********** 2026-04-20 00:44:50.741560 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:50.741566 | orchestrator | 2026-04-20 00:44:50.741572 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:44:50.741578 | orchestrator | Monday 20 April 2026 00:44:50 +0000 (0:00:00.183) 0:00:06.565 ********** 2026-04-20 00:44:50.741584 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:50.741590 | orchestrator | 2026-04-20 00:44:50.741601 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:44:57.894917 | orchestrator | Monday 20 April 2026 00:44:50 +0000 (0:00:00.163) 0:00:06.728 ********** 2026-04-20 00:44:57.895016 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:57.895026 | orchestrator | 2026-04-20 00:44:57.895034 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:44:57.895040 | orchestrator | Monday 20 April 2026 00:44:50 +0000 (0:00:00.166) 0:00:06.895 ********** 2026-04-20 00:44:57.895047 | orchestrator | ok: [testbed-node-3] => (item=sda1) 2026-04-20 00:44:57.895054 | orchestrator | ok: [testbed-node-3] => (item=sda14) 2026-04-20 
00:44:57.895060 | orchestrator | ok: [testbed-node-3] => (item=sda15) 2026-04-20 00:44:57.895066 | orchestrator | ok: [testbed-node-3] => (item=sda16) 2026-04-20 00:44:57.895072 | orchestrator | 2026-04-20 00:44:57.895078 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:44:57.895084 | orchestrator | Monday 20 April 2026 00:44:51 +0000 (0:00:00.831) 0:00:07.727 ********** 2026-04-20 00:44:57.895092 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:57.895098 | orchestrator | 2026-04-20 00:44:57.895104 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:44:57.895111 | orchestrator | Monday 20 April 2026 00:44:51 +0000 (0:00:00.178) 0:00:07.906 ********** 2026-04-20 00:44:57.895116 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:57.895123 | orchestrator | 2026-04-20 00:44:57.895129 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:44:57.895136 | orchestrator | Monday 20 April 2026 00:44:52 +0000 (0:00:00.180) 0:00:08.086 ********** 2026-04-20 00:44:57.895170 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:57.895177 | orchestrator | 2026-04-20 00:44:57.895183 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:44:57.895190 | orchestrator | Monday 20 April 2026 00:44:52 +0000 (0:00:00.167) 0:00:08.254 ********** 2026-04-20 00:44:57.895196 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:57.895202 | orchestrator | 2026-04-20 00:44:57.895221 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] ******************* 2026-04-20 00:44:57.895227 | orchestrator | Monday 20 April 2026 00:44:52 +0000 (0:00:00.171) 0:00:08.425 ********** 2026-04-20 00:44:57.895233 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:57.895240 | orchestrator | 2026-04-20 
00:44:57.895246 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] ******************* 2026-04-20 00:44:57.895252 | orchestrator | Monday 20 April 2026 00:44:52 +0000 (0:00:00.117) 0:00:08.543 ********** 2026-04-20 00:44:57.895260 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '4264b90b-a777-529d-80cd-078215cd7b61'}}) 2026-04-20 00:44:57.895267 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '0c7195b4-6e55-5dce-81dc-250aafa1626c'}}) 2026-04-20 00:44:57.895273 | orchestrator | 2026-04-20 00:44:57.895280 | orchestrator | TASK [Create block VGs] ******************************************************** 2026-04-20 00:44:57.895286 | orchestrator | Monday 20 April 2026 00:44:52 +0000 (0:00:00.142) 0:00:08.685 ********** 2026-04-20 00:44:57.895294 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'}) 2026-04-20 00:44:57.895302 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'}) 2026-04-20 00:44:57.895309 | orchestrator | 2026-04-20 00:44:57.895315 | orchestrator | TASK [Print 'Create block VGs'] ************************************************ 2026-04-20 00:44:57.895323 | orchestrator | Monday 20 April 2026 00:44:54 +0000 (0:00:01.955) 0:00:10.641 ********** 2026-04-20 00:44:57.895329 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})  2026-04-20 00:44:57.895337 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})  2026-04-20 00:44:57.895343 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:57.895401 
| orchestrator | 2026-04-20 00:44:57.895409 | orchestrator | TASK [Create block LVs] ******************************************************** 2026-04-20 00:44:57.895416 | orchestrator | Monday 20 April 2026 00:44:54 +0000 (0:00:00.129) 0:00:10.770 ********** 2026-04-20 00:44:57.895422 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'}) 2026-04-20 00:44:57.895429 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'}) 2026-04-20 00:44:57.895435 | orchestrator | 2026-04-20 00:44:57.895442 | orchestrator | TASK [Print 'Create block LVs'] ************************************************ 2026-04-20 00:44:57.895449 | orchestrator | Monday 20 April 2026 00:44:56 +0000 (0:00:01.411) 0:00:12.182 ********** 2026-04-20 00:44:57.895455 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})  2026-04-20 00:44:57.895462 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})  2026-04-20 00:44:57.895469 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:57.895477 | orchestrator | 2026-04-20 00:44:57.895484 | orchestrator | TASK [Create DB VGs] *********************************************************** 2026-04-20 00:44:57.895501 | orchestrator | Monday 20 April 2026 00:44:56 +0000 (0:00:00.137) 0:00:12.319 ********** 2026-04-20 00:44:57.895525 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:57.895534 | orchestrator | 2026-04-20 00:44:57.895541 | orchestrator | TASK [Print 'Create DB VGs'] *************************************************** 2026-04-20 00:44:57.895547 | orchestrator | Monday 20 April 2026 00:44:56 
+0000 (0:00:00.123) 0:00:12.443 ********** 2026-04-20 00:44:57.895553 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})  2026-04-20 00:44:57.895560 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})  2026-04-20 00:44:57.895566 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:57.895571 | orchestrator | 2026-04-20 00:44:57.895577 | orchestrator | TASK [Create WAL VGs] ********************************************************** 2026-04-20 00:44:57.895583 | orchestrator | Monday 20 April 2026 00:44:56 +0000 (0:00:00.258) 0:00:12.701 ********** 2026-04-20 00:44:57.895589 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:57.895594 | orchestrator | 2026-04-20 00:44:57.895600 | orchestrator | TASK [Print 'Create WAL VGs'] ************************************************** 2026-04-20 00:44:57.895606 | orchestrator | Monday 20 April 2026 00:44:56 +0000 (0:00:00.120) 0:00:12.822 ********** 2026-04-20 00:44:57.895612 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})  2026-04-20 00:44:57.895617 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})  2026-04-20 00:44:57.895623 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:57.895628 | orchestrator | 2026-04-20 00:44:57.895634 | orchestrator | TASK [Create DB+WAL VGs] ******************************************************* 2026-04-20 00:44:57.895640 | orchestrator | Monday 20 April 2026 00:44:56 +0000 (0:00:00.124) 0:00:12.946 ********** 2026-04-20 00:44:57.895646 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:57.895652 
| orchestrator | 2026-04-20 00:44:57.895658 | orchestrator | TASK [Print 'Create DB+WAL VGs'] *********************************************** 2026-04-20 00:44:57.895664 | orchestrator | Monday 20 April 2026 00:44:57 +0000 (0:00:00.127) 0:00:13.073 ********** 2026-04-20 00:44:57.895670 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})  2026-04-20 00:44:57.895676 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})  2026-04-20 00:44:57.895681 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:57.895687 | orchestrator | 2026-04-20 00:44:57.895693 | orchestrator | TASK [Prepare variables for OSD count check] *********************************** 2026-04-20 00:44:57.895699 | orchestrator | Monday 20 April 2026 00:44:57 +0000 (0:00:00.133) 0:00:13.206 ********** 2026-04-20 00:44:57.895706 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:44:57.895713 | orchestrator | 2026-04-20 00:44:57.895719 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] **************** 2026-04-20 00:44:57.895725 | orchestrator | Monday 20 April 2026 00:44:57 +0000 (0:00:00.127) 0:00:13.334 ********** 2026-04-20 00:44:57.895732 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})  2026-04-20 00:44:57.895739 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})  2026-04-20 00:44:57.895745 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:44:57.895752 | orchestrator | 2026-04-20 00:44:57.895759 | orchestrator | TASK [Count OSDs put on ceph_wal_devices defined in lvm_volumes] *************** 
2026-04-20 00:44:57.895782 | orchestrator | Monday 20 April 2026 00:44:57 +0000 (0:00:00.132) 0:00:13.467 **********
2026-04-20 00:44:57.895789 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})
2026-04-20 00:44:57.895796 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})
2026-04-20 00:44:57.895802 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:44:57.895808 | orchestrator |
2026-04-20 00:44:57.895815 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************
2026-04-20 00:44:57.895821 | orchestrator | Monday 20 April 2026 00:44:57 +0000 (0:00:00.142) 0:00:13.609 **********
2026-04-20 00:44:57.895828 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})
2026-04-20 00:44:57.895834 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})
2026-04-20 00:44:57.895841 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:44:57.895848 | orchestrator |
2026-04-20 00:44:57.895854 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] *********************
2026-04-20 00:44:57.895860 | orchestrator | Monday 20 April 2026 00:44:57 +0000 (0:00:00.140) 0:00:13.750 **********
2026-04-20 00:44:57.895867 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:44:57.895873 | orchestrator |
2026-04-20 00:44:57.895880 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ********************
2026-04-20 00:44:57.895891 | orchestrator | Monday 20 April 2026 00:44:57 +0000 (0:00:00.133) 0:00:13.883 **********
2026-04-20 00:45:03.582493 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.582592 | orchestrator |
2026-04-20 00:45:03.582604 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] *****************
2026-04-20 00:45:03.582611 | orchestrator | Monday 20 April 2026 00:44:58 +0000 (0:00:00.118) 0:00:14.002 **********
2026-04-20 00:45:03.582618 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.582624 | orchestrator |
2026-04-20 00:45:03.582630 | orchestrator | TASK [Print number of OSDs wanted per DB VG] ***********************************
2026-04-20 00:45:03.582636 | orchestrator | Monday 20 April 2026 00:44:58 +0000 (0:00:00.122) 0:00:14.124 **********
2026-04-20 00:45:03.582642 | orchestrator | ok: [testbed-node-3] => {
2026-04-20 00:45:03.582649 | orchestrator |     "_num_osds_wanted_per_db_vg": {}
2026-04-20 00:45:03.582656 | orchestrator | }
2026-04-20 00:45:03.582662 | orchestrator |
2026-04-20 00:45:03.582667 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] **********************************
2026-04-20 00:45:03.582673 | orchestrator | Monday 20 April 2026 00:44:58 +0000 (0:00:00.262) 0:00:14.386 **********
2026-04-20 00:45:03.582679 | orchestrator | ok: [testbed-node-3] => {
2026-04-20 00:45:03.582684 | orchestrator |     "_num_osds_wanted_per_wal_vg": {}
2026-04-20 00:45:03.582690 | orchestrator | }
2026-04-20 00:45:03.582696 | orchestrator |
2026-04-20 00:45:03.582701 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] *******************************
2026-04-20 00:45:03.582706 | orchestrator | Monday 20 April 2026 00:44:58 +0000 (0:00:00.131) 0:00:14.509 **********
2026-04-20 00:45:03.582712 | orchestrator | ok: [testbed-node-3] => {
2026-04-20 00:45:03.582718 | orchestrator |     "_num_osds_wanted_per_db_wal_vg": {}
2026-04-20 00:45:03.582724 | orchestrator | }
2026-04-20 00:45:03.582730 | orchestrator |
2026-04-20 00:45:03.582735 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ********************
2026-04-20 00:45:03.582741 | orchestrator | Monday 20 April 2026 00:44:58 +0000 (0:00:00.131) 0:00:14.641 **********
2026-04-20 00:45:03.582747 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:45:03.582752 | orchestrator |
2026-04-20 00:45:03.582775 | orchestrator | TASK [Gather WAL VGs with total and available size in bytes] *******************
2026-04-20 00:45:03.582782 | orchestrator | Monday 20 April 2026 00:44:59 +0000 (0:00:00.624) 0:00:15.266 **********
2026-04-20 00:45:03.582788 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:45:03.582816 | orchestrator |
2026-04-20 00:45:03.582823 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] ****************
2026-04-20 00:45:03.582830 | orchestrator | Monday 20 April 2026 00:44:59 +0000 (0:00:00.543) 0:00:15.809 **********
2026-04-20 00:45:03.582836 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:45:03.582841 | orchestrator |
2026-04-20 00:45:03.582847 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] *************************
2026-04-20 00:45:03.582853 | orchestrator | Monday 20 April 2026 00:45:00 +0000 (0:00:00.478) 0:00:16.288 **********
2026-04-20 00:45:03.582859 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:45:03.582864 | orchestrator |
2026-04-20 00:45:03.582870 | orchestrator | TASK [Calculate VG sizes (without buffer)] *************************************
2026-04-20 00:45:03.582876 | orchestrator | Monday 20 April 2026 00:45:00 +0000 (0:00:00.133) 0:00:16.422 **********
2026-04-20 00:45:03.582882 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.582887 | orchestrator |
2026-04-20 00:45:03.582893 | orchestrator | TASK [Calculate VG sizes (with buffer)] ****************************************
2026-04-20 00:45:03.582899 | orchestrator | Monday 20 April 2026 00:45:00 +0000 (0:00:00.087) 0:00:16.510 **********
2026-04-20 00:45:03.582904 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.582910 | orchestrator |
2026-04-20 00:45:03.582916 | orchestrator | TASK [Print LVM VGs report data] ***********************************************
2026-04-20 00:45:03.582922 | orchestrator | Monday 20 April 2026 00:45:00 +0000 (0:00:00.103) 0:00:16.613 **********
2026-04-20 00:45:03.582927 | orchestrator | ok: [testbed-node-3] => {
2026-04-20 00:45:03.582933 | orchestrator |     "vgs_report": {
2026-04-20 00:45:03.582940 | orchestrator |         "vg": []
2026-04-20 00:45:03.582948 | orchestrator |     }
2026-04-20 00:45:03.582959 | orchestrator | }
2026-04-20 00:45:03.582969 | orchestrator |
2026-04-20 00:45:03.582978 | orchestrator | TASK [Print LVM VG sizes] ******************************************************
2026-04-20 00:45:03.582988 | orchestrator | Monday 20 April 2026 00:45:00 +0000 (0:00:00.129) 0:00:16.742 **********
2026-04-20 00:45:03.582997 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583005 | orchestrator |
2026-04-20 00:45:03.583014 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] ************************
2026-04-20 00:45:03.583023 | orchestrator | Monday 20 April 2026 00:45:00 +0000 (0:00:00.123) 0:00:16.866 **********
2026-04-20 00:45:03.583031 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583042 | orchestrator |
2026-04-20 00:45:03.583052 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] ****************************
2026-04-20 00:45:03.583063 | orchestrator | Monday 20 April 2026 00:45:00 +0000 (0:00:00.126) 0:00:16.992 **********
2026-04-20 00:45:03.583072 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583082 | orchestrator |
2026-04-20 00:45:03.583092 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] *******************
2026-04-20 00:45:03.583103 | orchestrator | Monday 20 April 2026 00:45:01 +0000 (0:00:00.133) 0:00:17.125 **********
2026-04-20 00:45:03.583110 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583119 | orchestrator |
2026-04-20 00:45:03.583128 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] ***********************
2026-04-20 00:45:03.583137 | orchestrator | Monday 20 April 2026 00:45:01 +0000 (0:00:00.277) 0:00:17.403 **********
2026-04-20 00:45:03.583148 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583159 | orchestrator |
2026-04-20 00:45:03.583168 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] ***************************
2026-04-20 00:45:03.583177 | orchestrator | Monday 20 April 2026 00:45:01 +0000 (0:00:00.124) 0:00:17.527 **********
2026-04-20 00:45:03.583188 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583198 | orchestrator |
2026-04-20 00:45:03.583207 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] *****************
2026-04-20 00:45:03.583215 | orchestrator | Monday 20 April 2026 00:45:01 +0000 (0:00:00.127) 0:00:17.655 **********
2026-04-20 00:45:03.583224 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583231 | orchestrator |
2026-04-20 00:45:03.583240 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] ****************
2026-04-20 00:45:03.583254 | orchestrator | Monday 20 April 2026 00:45:01 +0000 (0:00:00.126) 0:00:17.781 **********
2026-04-20 00:45:03.583283 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583292 | orchestrator |
2026-04-20 00:45:03.583301 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ********************
2026-04-20 00:45:03.583312 | orchestrator | Monday 20 April 2026 00:45:01 +0000 (0:00:00.125) 0:00:17.906 **********
2026-04-20 00:45:03.583319 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583325 | orchestrator |
2026-04-20 00:45:03.583332 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] *****************
2026-04-20 00:45:03.583339 | orchestrator | Monday 20 April 2026 00:45:02 +0000 (0:00:00.123) 0:00:18.030 **********
2026-04-20 00:45:03.583345 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583351 | orchestrator |
2026-04-20 00:45:03.583357 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] *********************
2026-04-20 00:45:03.583385 | orchestrator | Monday 20 April 2026 00:45:02 +0000 (0:00:00.125) 0:00:18.156 **********
2026-04-20 00:45:03.583392 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583398 | orchestrator |
2026-04-20 00:45:03.583404 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] ***********
2026-04-20 00:45:03.583409 | orchestrator | Monday 20 April 2026 00:45:02 +0000 (0:00:00.105) 0:00:18.262 **********
2026-04-20 00:45:03.583415 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583421 | orchestrator |
2026-04-20 00:45:03.583427 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] *************************
2026-04-20 00:45:03.583433 | orchestrator | Monday 20 April 2026 00:45:02 +0000 (0:00:00.123) 0:00:18.385 **********
2026-04-20 00:45:03.583439 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583445 | orchestrator |
2026-04-20 00:45:03.583451 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] *********************
2026-04-20 00:45:03.583457 | orchestrator | Monday 20 April 2026 00:45:02 +0000 (0:00:00.118) 0:00:18.503 **********
2026-04-20 00:45:03.583463 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583469 | orchestrator |
2026-04-20 00:45:03.583475 | orchestrator | TASK [Create DB LVs for ceph_db_devices] ***************************************
2026-04-20 00:45:03.583481 | orchestrator | Monday 20 April 2026 00:45:02 +0000 (0:00:00.119) 0:00:18.623 **********
2026-04-20 00:45:03.583489 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})
2026-04-20 00:45:03.583499 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})
2026-04-20 00:45:03.583506 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583513 | orchestrator |
2026-04-20 00:45:03.583519 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] *******************************
2026-04-20 00:45:03.583525 | orchestrator | Monday 20 April 2026 00:45:02 +0000 (0:00:00.141) 0:00:18.764 **********
2026-04-20 00:45:03.583530 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})
2026-04-20 00:45:03.583537 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})
2026-04-20 00:45:03.583543 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583548 | orchestrator |
2026-04-20 00:45:03.583554 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] *************************************
2026-04-20 00:45:03.583560 | orchestrator | Monday 20 April 2026 00:45:03 +0000 (0:00:00.306) 0:00:19.070 **********
2026-04-20 00:45:03.583566 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})
2026-04-20 00:45:03.583571 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})
2026-04-20 00:45:03.583584 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583589 | orchestrator |
2026-04-20 00:45:03.583595 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] *****************************
2026-04-20 00:45:03.583602 | orchestrator | Monday 20 April 2026 00:45:03 +0000 (0:00:00.139) 0:00:19.210 **********
2026-04-20 00:45:03.583607 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})
2026-04-20 00:45:03.583614 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})
2026-04-20 00:45:03.583620 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583625 | orchestrator |
2026-04-20 00:45:03.583630 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] **********************************
2026-04-20 00:45:03.583635 | orchestrator | Monday 20 April 2026 00:45:03 +0000 (0:00:00.137) 0:00:19.348 **********
2026-04-20 00:45:03.583641 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})
2026-04-20 00:45:03.583647 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})
2026-04-20 00:45:03.583653 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:03.583659 | orchestrator |
2026-04-20 00:45:03.583664 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] **************************
2026-04-20 00:45:03.583669 | orchestrator | Monday 20 April 2026 00:45:03 +0000 (0:00:00.168) 0:00:19.516 **********
2026-04-20 00:45:03.583681 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})
2026-04-20 00:45:08.292761 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})
2026-04-20 00:45:08.292856 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:08.292867 | orchestrator |
2026-04-20 00:45:08.292876 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] ***********************************
2026-04-20 00:45:08.292884 | orchestrator | Monday 20 April 2026 00:45:03 +0000 (0:00:00.133) 0:00:19.649 **********
2026-04-20 00:45:08.292909 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})
2026-04-20 00:45:08.292918 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})
2026-04-20 00:45:08.292922 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:08.292926 | orchestrator |
2026-04-20 00:45:08.292931 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] ***************************
2026-04-20 00:45:08.292936 | orchestrator | Monday 20 April 2026 00:45:03 +0000 (0:00:00.168) 0:00:19.818 **********
2026-04-20 00:45:08.292942 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})
2026-04-20 00:45:08.292952 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})
2026-04-20 00:45:08.292959 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:08.292965 | orchestrator |
2026-04-20 00:45:08.292971 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ********************************
2026-04-20 00:45:08.292977 | orchestrator | Monday 20 April 2026 00:45:03 +0000 (0:00:00.132) 0:00:19.951 **********
2026-04-20 00:45:08.292983 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:45:08.292991 | orchestrator |
2026-04-20 00:45:08.292997 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ********************************
2026-04-20 00:45:08.293022 | orchestrator | Monday 20 April 2026 00:45:04 +0000 (0:00:00.528) 0:00:20.479 **********
2026-04-20 00:45:08.293027 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:45:08.293030 | orchestrator |
2026-04-20 00:45:08.293034 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] ***********************
2026-04-20 00:45:08.293038 | orchestrator | Monday 20 April 2026 00:45:05 +0000 (0:00:00.552) 0:00:21.031 **********
2026-04-20 00:45:08.293042 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:45:08.293047 | orchestrator |
2026-04-20 00:45:08.293053 | orchestrator | TASK [Create list of VG/LV names] **********************************************
2026-04-20 00:45:08.293059 | orchestrator | Monday 20 April 2026 00:45:05 +0000 (0:00:00.135) 0:00:21.166 **********
2026-04-20 00:45:08.293065 | orchestrator | ok: [testbed-node-3] => (item={'lv_name': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'vg_name': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})
2026-04-20 00:45:08.293073 | orchestrator | ok: [testbed-node-3] => (item={'lv_name': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'vg_name': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})
2026-04-20 00:45:08.293079 | orchestrator |
2026-04-20 00:45:08.293085 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] **********************
2026-04-20 00:45:08.293092 | orchestrator | Monday 20 April 2026 00:45:05 +0000 (0:00:00.157) 0:00:21.324 **********
2026-04-20 00:45:08.293098 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})
2026-04-20 00:45:08.293104 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})
2026-04-20 00:45:08.293111 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:08.293116 | orchestrator |
2026-04-20 00:45:08.293119 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] *************************
2026-04-20 00:45:08.293123 | orchestrator | Monday 20 April 2026 00:45:05 +0000 (0:00:00.156) 0:00:21.481 **********
2026-04-20 00:45:08.293127 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})
2026-04-20 00:45:08.293131 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})
2026-04-20 00:45:08.293135 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:08.293139 | orchestrator |
2026-04-20 00:45:08.293143 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************
2026-04-20 00:45:08.293146 | orchestrator | Monday 20 April 2026 00:45:05 +0000 (0:00:00.251) 0:00:21.732 **********
2026-04-20 00:45:08.293150 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})
2026-04-20 00:45:08.293154 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})
2026-04-20 00:45:08.293158 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:45:08.293162 | orchestrator |
2026-04-20 00:45:08.293166 | orchestrator | TASK [Print LVM report data] ***************************************************
2026-04-20 00:45:08.293170 | orchestrator | Monday 20 April 2026 00:45:05 +0000 (0:00:00.146) 0:00:21.879 **********
2026-04-20 00:45:08.293187 | orchestrator | ok: [testbed-node-3] => {
2026-04-20 00:45:08.293191 | orchestrator |     "lvm_report": {
2026-04-20 00:45:08.293195 | orchestrator |         "lv": [
2026-04-20 00:45:08.293199 | orchestrator |             {
2026-04-20 00:45:08.293203 | orchestrator |                 "lv_name": "osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c",
2026-04-20 00:45:08.293208 | orchestrator |                 "vg_name": "ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c"
2026-04-20 00:45:08.293212 | orchestrator |             },
2026-04-20 00:45:08.293216 | orchestrator |             {
2026-04-20 00:45:08.293224 | orchestrator |                 "lv_name": "osd-block-4264b90b-a777-529d-80cd-078215cd7b61",
2026-04-20 00:45:08.293228 | orchestrator |                 "vg_name": "ceph-4264b90b-a777-529d-80cd-078215cd7b61"
2026-04-20 00:45:08.293232 | orchestrator |             }
2026-04-20 00:45:08.293235 | orchestrator |         ],
2026-04-20 00:45:08.293239 | orchestrator |         "pv": [
2026-04-20 00:45:08.293243 | orchestrator |             {
2026-04-20 00:45:08.293247 | orchestrator |                 "pv_name": "/dev/sdb",
2026-04-20 00:45:08.293250 | orchestrator |                 "vg_name": "ceph-4264b90b-a777-529d-80cd-078215cd7b61"
2026-04-20 00:45:08.293254 | orchestrator |             },
2026-04-20 00:45:08.293258 | orchestrator |             {
2026-04-20 00:45:08.293262 | orchestrator |                 "pv_name": "/dev/sdc",
2026-04-20 00:45:08.293266 | orchestrator |                 "vg_name": "ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c"
2026-04-20 00:45:08.293269 | orchestrator |             }
2026-04-20 00:45:08.293273 | orchestrator |         ]
2026-04-20 00:45:08.293277 | orchestrator |     }
2026-04-20 00:45:08.293281 | orchestrator | }
2026-04-20 00:45:08.293285 | orchestrator |
2026-04-20 00:45:08.293288 | orchestrator | PLAY [Ceph create LVM devices] *************************************************
2026-04-20 00:45:08.293292 | orchestrator |
2026-04-20 00:45:08.293296 | orchestrator | TASK [Get extra vars for Ceph configuration] ***********************************
2026-04-20 00:45:08.293303 | orchestrator | Monday 20 April 2026 00:45:06 +0000 (0:00:00.247) 0:00:22.127 **********
2026-04-20 00:45:08.293307 | orchestrator | ok: [testbed-node-4 -> testbed-manager(192.168.16.5)]
2026-04-20 00:45:08.293311 | orchestrator |
2026-04-20 00:45:08.293315 | orchestrator | TASK [Get initial list of available block devices] *****************************
2026-04-20 00:45:08.293319 | orchestrator | Monday 20 April 2026 00:45:06 +0000 (0:00:00.244) 0:00:22.371 **********
2026-04-20 00:45:08.293323 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:45:08.293327 | orchestrator |
2026-04-20 00:45:08.293330 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:45:08.293334 | orchestrator | Monday 20 April 2026 00:45:06 +0000 (0:00:00.216) 0:00:22.587 **********
2026-04-20 00:45:08.293338 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop0)
2026-04-20 00:45:08.293342 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop1)
2026-04-20 00:45:08.293346 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop2)
2026-04-20 00:45:08.293349 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop3)
2026-04-20 00:45:08.293353 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop4)
2026-04-20 00:45:08.293357 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop5)
2026-04-20 00:45:08.293361 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop6)
2026-04-20 00:45:08.293364 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop7)
2026-04-20 00:45:08.293368 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sda)
2026-04-20 00:45:08.293372 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdb)
2026-04-20 00:45:08.293424 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdc)
2026-04-20 00:45:08.293429 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdd)
2026-04-20 00:45:08.293433 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sr0)
2026-04-20 00:45:08.293436 | orchestrator |
2026-04-20 00:45:08.293440 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:45:08.293444 | orchestrator | Monday 20 April 2026 00:45:06 +0000 (0:00:00.361) 0:00:22.949 **********
2026-04-20 00:45:08.293448 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:08.293451 | orchestrator |
2026-04-20 00:45:08.293459 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:45:08.293463 | orchestrator | Monday 20 April 2026 00:45:07 +0000 (0:00:00.179) 0:00:23.129 **********
2026-04-20 00:45:08.293466 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:08.293470 | orchestrator |
2026-04-20 00:45:08.293474 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:45:08.293478 | orchestrator | Monday 20 April 2026 00:45:07 +0000 (0:00:00.177) 0:00:23.307 **********
2026-04-20 00:45:08.293481 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:08.293485 | orchestrator |
2026-04-20 00:45:08.293489 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:45:08.293493 | orchestrator | Monday 20 April 2026 00:45:07 +0000 (0:00:00.164) 0:00:23.471 **********
2026-04-20 00:45:08.293497 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:08.293500 | orchestrator |
2026-04-20 00:45:08.293504 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:45:08.293508 | orchestrator | Monday 20 April 2026 00:45:07 +0000 (0:00:00.444) 0:00:23.915 **********
2026-04-20 00:45:08.293512 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:08.293515 | orchestrator |
2026-04-20 00:45:08.293519 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:45:08.293523 | orchestrator | Monday 20 April 2026 00:45:08 +0000 (0:00:00.192) 0:00:24.108 **********
2026-04-20 00:45:08.293527 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:08.293531 | orchestrator |
2026-04-20 00:45:08.293538 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:45:18.740373 | orchestrator | Monday 20 April 2026 00:45:08 +0000 (0:00:00.174) 0:00:24.282 **********
2026-04-20 00:45:18.740565 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:18.740595 | orchestrator |
2026-04-20 00:45:18.740609 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:45:18.740620 | orchestrator | Monday 20 April 2026 00:45:08 +0000 (0:00:00.207) 0:00:24.490 **********
2026-04-20 00:45:18.740632 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:18.740643 | orchestrator |
2026-04-20 00:45:18.740654 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:45:18.740665 | orchestrator | Monday 20 April 2026 00:45:08 +0000 (0:00:00.216) 0:00:24.706 **********
2026-04-20 00:45:18.740676 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501)
2026-04-20 00:45:18.740688 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501)
2026-04-20 00:45:18.740698 | orchestrator |
2026-04-20 00:45:18.740709 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:45:18.740720 | orchestrator | Monday 20 April 2026 00:45:09 +0000 (0:00:00.415) 0:00:25.122 **********
2026-04-20 00:45:18.740731 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_6f84c887-ba73-482f-a41f-d5b1a59c2e3c)
2026-04-20 00:45:18.740742 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_6f84c887-ba73-482f-a41f-d5b1a59c2e3c)
2026-04-20 00:45:18.740753 | orchestrator |
2026-04-20 00:45:18.740764 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:45:18.740775 | orchestrator | Monday 20 April 2026 00:45:09 +0000 (0:00:00.404) 0:00:25.526 **********
2026-04-20 00:45:18.740786 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_9b7f1cab-7403-4991-80fd-9e18e6faf85e)
2026-04-20 00:45:18.740797 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_9b7f1cab-7403-4991-80fd-9e18e6faf85e)
2026-04-20 00:45:18.740808 | orchestrator |
2026-04-20 00:45:18.740819 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:45:18.740830 | orchestrator | Monday 20 April 2026 00:45:09 +0000 (0:00:00.454) 0:00:25.981 **********
2026-04-20 00:45:18.740840 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_0604a395-fc8c-4060-a9f6-9fb568501435)
2026-04-20 00:45:18.740876 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_0604a395-fc8c-4060-a9f6-9fb568501435)
2026-04-20 00:45:18.740888 | orchestrator |
2026-04-20 00:45:18.740899 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:45:18.740912 | orchestrator | Monday 20 April 2026 00:45:10 +0000 (0:00:00.415) 0:00:26.396 **********
2026-04-20 00:45:18.740924 | orchestrator | ok: [testbed-node-4] => (item=ata-QEMU_DVD-ROM_QM00001)
2026-04-20 00:45:18.740940 | orchestrator |
2026-04-20 00:45:18.740960 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:45:18.740977 | orchestrator | Monday 20 April 2026 00:45:10 +0000 (0:00:00.328) 0:00:26.724 **********
2026-04-20 00:45:18.740995 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop0)
2026-04-20 00:45:18.741016 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop1)
2026-04-20 00:45:18.741037 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop2)
2026-04-20 00:45:18.741057 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop3)
2026-04-20 00:45:18.741077 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop4)
2026-04-20 00:45:18.741090 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop5)
2026-04-20 00:45:18.741102 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop6)
2026-04-20 00:45:18.741115 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop7)
2026-04-20 00:45:18.741128 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sda)
2026-04-20 00:45:18.741140 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdb)
2026-04-20 00:45:18.741151 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdc)
2026-04-20 00:45:18.741161 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdd)
2026-04-20 00:45:18.741172 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sr0)
2026-04-20 00:45:18.741183 | orchestrator |
2026-04-20 00:45:18.741194 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:45:18.741220 | orchestrator | Monday 20 April 2026 00:45:11 +0000 (0:00:00.761) 0:00:27.486 **********
2026-04-20 00:45:18.741231 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:18.741242 | orchestrator |
2026-04-20 00:45:18.741253 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:45:18.741264 | orchestrator | Monday 20 April 2026 00:45:11 +0000 (0:00:00.235) 0:00:27.722 **********
2026-04-20 00:45:18.741274 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:18.741285 | orchestrator |
2026-04-20 00:45:18.741296 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:45:18.741307 | orchestrator | Monday 20 April 2026 00:45:11 +0000 (0:00:00.215) 0:00:27.937 **********
2026-04-20 00:45:18.741317 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:18.741328 | orchestrator |
2026-04-20 00:45:18.741359 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:45:18.741370 | orchestrator | Monday 20 April 2026 00:45:12 +0000 (0:00:00.224) 0:00:28.161 **********
2026-04-20 00:45:18.741381 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:18.741392 | orchestrator |
2026-04-20 00:45:18.741430 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:45:18.741450 | orchestrator | Monday 20 April 2026 00:45:12 +0000 (0:00:00.204) 0:00:28.366 **********
2026-04-20 00:45:18.741468 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:18.741487 | orchestrator |
2026-04-20 00:45:18.741505 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:45:18.741538 | orchestrator | Monday 20 April 2026 00:45:12 +0000 (0:00:00.200) 0:00:28.567 **********
2026-04-20 00:45:18.741558 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:18.741577 | orchestrator |
2026-04-20
00:45:18.741595 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:45:18.741609 | orchestrator | Monday 20 April 2026 00:45:12 +0000 (0:00:00.206) 0:00:28.773 **********
2026-04-20 00:45:18.741621 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:18.741632 | orchestrator |
2026-04-20 00:45:18.741642 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:45:18.741653 | orchestrator | Monday 20 April 2026 00:45:12 +0000 (0:00:00.205) 0:00:28.979 **********
2026-04-20 00:45:18.741664 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:18.741674 | orchestrator |
2026-04-20 00:45:18.741685 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:45:18.741695 | orchestrator | Monday 20 April 2026 00:45:13 +0000 (0:00:00.204) 0:00:29.183 **********
2026-04-20 00:45:18.741713 | orchestrator | ok: [testbed-node-4] => (item=sda1)
2026-04-20 00:45:18.741724 | orchestrator | ok: [testbed-node-4] => (item=sda14)
2026-04-20 00:45:18.741735 | orchestrator | ok: [testbed-node-4] => (item=sda15)
2026-04-20 00:45:18.741745 | orchestrator | ok: [testbed-node-4] => (item=sda16)
2026-04-20 00:45:18.741756 | orchestrator |
2026-04-20 00:45:18.741767 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:45:18.741777 | orchestrator | Monday 20 April 2026 00:45:14 +0000 (0:00:00.858) 0:00:30.042 **********
2026-04-20 00:45:18.741788 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:18.741799 | orchestrator |
2026-04-20 00:45:18.741809 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:45:18.741820 | orchestrator | Monday 20 April 2026 00:45:14 +0000 (0:00:00.227) 0:00:30.270 **********
2026-04-20 00:45:18.741831 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:18.741841 | orchestrator |
2026-04-20 00:45:18.741852 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:45:18.741863 | orchestrator | Monday 20 April 2026 00:45:14 +0000 (0:00:00.198) 0:00:30.468 **********
2026-04-20 00:45:18.741873 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:18.741884 | orchestrator |
2026-04-20 00:45:18.741894 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2026-04-20 00:45:18.741906 | orchestrator | Monday 20 April 2026 00:45:15 +0000 (0:00:00.667) 0:00:31.135 **********
2026-04-20 00:45:18.741916 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:18.741927 | orchestrator |
2026-04-20 00:45:18.741937 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] *******************
2026-04-20 00:45:18.741948 | orchestrator | Monday 20 April 2026 00:45:15 +0000 (0:00:00.203) 0:00:31.339 **********
2026-04-20 00:45:18.741958 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:18.741969 | orchestrator |
2026-04-20 00:45:18.741980 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] *******************
2026-04-20 00:45:18.741997 | orchestrator | Monday 20 April 2026 00:45:15 +0000 (0:00:00.155) 0:00:31.495 **********
2026-04-20 00:45:18.742079 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '7b8b741f-ff85-57a0-9457-c04aa474e6a9'}})
2026-04-20 00:45:18.742102 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'a3c07e85-95b7-5759-bf4d-00aad97d3561'}})
2026-04-20 00:45:18.742113 | orchestrator |
2026-04-20 00:45:18.742124 | orchestrator | TASK [Create block VGs] ********************************************************
2026-04-20 00:45:18.742135 | orchestrator | Monday 20 April 2026 00:45:15 +0000 (0:00:00.199) 0:00:31.694 **********
2026-04-20 00:45:18.742147 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:18.742159 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:18.742177 | orchestrator |
2026-04-20 00:45:18.742189 | orchestrator | TASK [Print 'Create block VGs'] ************************************************
2026-04-20 00:45:18.742199 | orchestrator | Monday 20 April 2026 00:45:17 +0000 (0:00:01.728) 0:00:33.422 **********
2026-04-20 00:45:18.742210 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:18.742222 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:18.742233 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:18.742243 | orchestrator |
2026-04-20 00:45:18.742254 | orchestrator | TASK [Create block LVs] ********************************************************
2026-04-20 00:45:18.742265 | orchestrator | Monday 20 April 2026 00:45:17 +0000 (0:00:00.141) 0:00:33.564 **********
2026-04-20 00:45:18.742276 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:18.742299 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:24.120128 | orchestrator |
2026-04-20 00:45:24.120225 | orchestrator | TASK [Print 'Create block LVs'] ************************************************
2026-04-20 00:45:24.120236 | orchestrator | Monday 20 April 2026
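The "Create dict of block VGs -> PVs from ceph_osd_devices" and "Create block VGs/LVs" tasks above derive one `ceph-<uuid>` volume group and one `osd-block-<uuid>` logical volume per OSD device from its `osd_lvm_uuid`. As an illustrative sketch only (not the playbook's actual code; `lvm_names` is a hypothetical helper), the naming scheme visible in the log amounts to:

```python
# Sketch of the VG/LV naming seen in the log above. `ceph_osd_devices`
# mirrors the structure printed by the play; `lvm_names` is a made-up
# helper, not part of the OSISM playbooks.
ceph_osd_devices = {
    "sdb": {"osd_lvm_uuid": "7b8b741f-ff85-57a0-9457-c04aa474e6a9"},
    "sdc": {"osd_lvm_uuid": "a3c07e85-95b7-5759-bf4d-00aad97d3561"},
}

def lvm_names(devices):
    """Return the data LV / data VG pairs looped over by 'Create block VGs/LVs'."""
    return [
        {
            "data": f"osd-block-{spec['osd_lvm_uuid']}",
            "data_vg": f"ceph-{spec['osd_lvm_uuid']}",
        }
        for spec in devices.values()
    ]
```

On the node these pairs presumably correspond to a `vgcreate`/`lvcreate` per device, which is what the `changed:` items in the log report.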
00:45:18 +0000 (0:00:01.250) 0:00:34.814 **********
2026-04-20 00:45:24.120245 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:24.120254 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:24.120261 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.120268 | orchestrator |
2026-04-20 00:45:24.120275 | orchestrator | TASK [Create DB VGs] ***********************************************************
2026-04-20 00:45:24.120281 | orchestrator | Monday 20 April 2026 00:45:18 +0000 (0:00:00.166) 0:00:34.981 **********
2026-04-20 00:45:24.120287 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.120294 | orchestrator |
2026-04-20 00:45:24.120300 | orchestrator | TASK [Print 'Create DB VGs'] ***************************************************
2026-04-20 00:45:24.120307 | orchestrator | Monday 20 April 2026 00:45:19 +0000 (0:00:00.136) 0:00:35.117 **********
2026-04-20 00:45:24.120328 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:24.120336 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:24.120342 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.120349 | orchestrator |
2026-04-20 00:45:24.120355 | orchestrator | TASK [Create WAL VGs] **********************************************************
2026-04-20 00:45:24.120362 | orchestrator | Monday 20 April 2026 00:45:19 +0000 (0:00:00.167) 0:00:35.285 **********
2026-04-20 00:45:24.120368 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.120373 | orchestrator |
2026-04-20 00:45:24.120379 | orchestrator | TASK [Print 'Create WAL VGs'] **************************************************
2026-04-20 00:45:24.120385 | orchestrator | Monday 20 April 2026 00:45:19 +0000 (0:00:00.124) 0:00:35.409 **********
2026-04-20 00:45:24.120390 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:24.120397 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:24.120403 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.120479 | orchestrator |
2026-04-20 00:45:24.120486 | orchestrator | TASK [Create DB+WAL VGs] *******************************************************
2026-04-20 00:45:24.120493 | orchestrator | Monday 20 April 2026 00:45:19 +0000 (0:00:00.144) 0:00:35.553 **********
2026-04-20 00:45:24.120500 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.120506 | orchestrator |
2026-04-20 00:45:24.120513 | orchestrator | TASK [Print 'Create DB+WAL VGs'] ***********************************************
2026-04-20 00:45:24.120520 | orchestrator | Monday 20 April 2026 00:45:19 +0000 (0:00:00.285) 0:00:35.839 **********
2026-04-20 00:45:24.120526 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:24.120534 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:24.120540 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.120547 | orchestrator |
2026-04-20 00:45:24.120553 | orchestrator | TASK [Prepare variables for OSD count check] ***********************************
2026-04-20 00:45:24.120560 | orchestrator | Monday 20 April 2026 00:45:20 +0000 (0:00:00.168) 0:00:36.008 **********
2026-04-20 00:45:24.120568 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:45:24.120577 | orchestrator |
2026-04-20 00:45:24.120583 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] ****************
2026-04-20 00:45:24.120589 | orchestrator | Monday 20 April 2026 00:45:20 +0000 (0:00:00.136) 0:00:36.144 **********
2026-04-20 00:45:24.120596 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:24.120602 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:24.120607 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.120613 | orchestrator |
2026-04-20 00:45:24.120620 | orchestrator | TASK [Count OSDs put on ceph_wal_devices defined in lvm_volumes] ***************
2026-04-20 00:45:24.120626 | orchestrator | Monday 20 April 2026 00:45:20 +0000 (0:00:00.138) 0:00:36.283 **********
2026-04-20 00:45:24.120632 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:24.120638 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:24.120645 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.120651 | orchestrator |
2026-04-20 00:45:24.120657 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************
2026-04-20 00:45:24.120680 | orchestrator | Monday 20 April 2026 00:45:20 +0000 (0:00:00.140) 0:00:36.423
**********
2026-04-20 00:45:24.120686 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:24.120692 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:24.120698 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.120704 | orchestrator |
2026-04-20 00:45:24.120709 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] *********************
2026-04-20 00:45:24.120715 | orchestrator | Monday 20 April 2026 00:45:20 +0000 (0:00:00.147) 0:00:36.570 **********
2026-04-20 00:45:24.120721 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.120727 | orchestrator |
2026-04-20 00:45:24.120733 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ********************
2026-04-20 00:45:24.120739 | orchestrator | Monday 20 April 2026 00:45:20 +0000 (0:00:00.131) 0:00:36.702 **********
2026-04-20 00:45:24.120744 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.120758 | orchestrator |
2026-04-20 00:45:24.120764 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] *****************
2026-04-20 00:45:24.120771 | orchestrator | Monday 20 April 2026 00:45:20 +0000 (0:00:00.123) 0:00:36.826 **********
2026-04-20 00:45:24.120777 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.120784 | orchestrator |
2026-04-20 00:45:24.120791 | orchestrator | TASK [Print number of OSDs wanted per DB VG] ***********************************
2026-04-20 00:45:24.120797 | orchestrator | Monday 20 April 2026 00:45:20 +0000 (0:00:00.133) 0:00:36.960 **********
2026-04-20 00:45:24.120803 | orchestrator | ok: [testbed-node-4] => {
2026-04-20 00:45:24.120809 | orchestrator |     "_num_osds_wanted_per_db_vg": {}
2026-04-20 00:45:24.120816 | orchestrator | }
2026-04-20 00:45:24.120822 | orchestrator |
2026-04-20 00:45:24.120828 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] **********************************
2026-04-20 00:45:24.120894 | orchestrator | Monday 20 April 2026 00:45:21 +0000 (0:00:00.133) 0:00:37.093 **********
2026-04-20 00:45:24.120903 | orchestrator | ok: [testbed-node-4] => {
2026-04-20 00:45:24.120910 | orchestrator |     "_num_osds_wanted_per_wal_vg": {}
2026-04-20 00:45:24.120917 | orchestrator | }
2026-04-20 00:45:24.120923 | orchestrator |
2026-04-20 00:45:24.120930 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] *******************************
2026-04-20 00:45:24.120937 | orchestrator | Monday 20 April 2026 00:45:21 +0000 (0:00:00.181) 0:00:37.274 **********
2026-04-20 00:45:24.120943 | orchestrator | ok: [testbed-node-4] => {
2026-04-20 00:45:24.120950 | orchestrator |     "_num_osds_wanted_per_db_wal_vg": {}
2026-04-20 00:45:24.120958 | orchestrator | }
2026-04-20 00:45:24.120965 | orchestrator |
2026-04-20 00:45:24.120972 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ********************
2026-04-20 00:45:24.120979 | orchestrator | Monday 20 April 2026 00:45:21 +0000 (0:00:00.117) 0:00:37.392 **********
2026-04-20 00:45:24.120986 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:45:24.120993 | orchestrator |
2026-04-20 00:45:24.121013 | orchestrator | TASK [Gather WAL VGs with total and available size in bytes] *******************
2026-04-20 00:45:24.121021 | orchestrator | Monday 20 April 2026 00:45:22 +0000 (0:00:00.668) 0:00:38.061 **********
2026-04-20 00:45:24.121028 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:45:24.121042 | orchestrator |
2026-04-20 00:45:24.121048 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] ****************
2026-04-20 00:45:24.121055 | orchestrator | Monday 20 April 2026 00:45:22 +0000 (0:00:00.536) 0:00:38.598 **********
2026-04-20 00:45:24.121061 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:45:24.121068 | orchestrator |
2026-04-20 00:45:24.121075 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] *************************
2026-04-20 00:45:24.121082 | orchestrator | Monday 20 April 2026 00:45:23 +0000 (0:00:00.505) 0:00:39.103 **********
2026-04-20 00:45:24.121088 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:45:24.121094 | orchestrator |
2026-04-20 00:45:24.121100 | orchestrator | TASK [Calculate VG sizes (without buffer)] *************************************
2026-04-20 00:45:24.121106 | orchestrator | Monday 20 April 2026 00:45:23 +0000 (0:00:00.128) 0:00:39.231 **********
2026-04-20 00:45:24.121113 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.121118 | orchestrator |
2026-04-20 00:45:24.121125 | orchestrator | TASK [Calculate VG sizes (with buffer)] ****************************************
2026-04-20 00:45:24.121131 | orchestrator | Monday 20 April 2026 00:45:23 +0000 (0:00:00.096) 0:00:39.328 **********
2026-04-20 00:45:24.121138 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.121145 | orchestrator |
2026-04-20 00:45:24.121151 | orchestrator | TASK [Print LVM VGs report data] ***********************************************
2026-04-20 00:45:24.121158 | orchestrator | Monday 20 April 2026 00:45:23 +0000 (0:00:00.095) 0:00:39.424 **********
2026-04-20 00:45:24.121165 | orchestrator | ok: [testbed-node-4] => {
2026-04-20 00:45:24.121172 | orchestrator |     "vgs_report": {
2026-04-20 00:45:24.121179 | orchestrator |         "vg": []
2026-04-20 00:45:24.121185 | orchestrator |     }
2026-04-20 00:45:24.121192 | orchestrator | }
2026-04-20 00:45:24.121198 | orchestrator |
2026-04-20 00:45:24.121205 | orchestrator | TASK [Print LVM VG sizes] ******************************************************
2026-04-20 00:45:24.121222 | orchestrator | Monday 20 April 2026 00:45:23 +0000 (0:00:00.142) 0:00:39.566 **********
2026-04-20 00:45:24.121230 |
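The "Gather DB/WAL/DB+WAL VGs with total and available size in bytes" and "Combine JSON" tasks above work from LVM's JSON report output (the shape produced by `vgs --reportformat json`). A hedged sketch of reducing such a report to a name-to-sizes map follows; the sample JSON is fabricated and the playbook's actual field handling may differ:

```python
import json

# Fabricated sample in the shape of an LVM `--reportformat json` report;
# the VG name and byte values are made up for illustration.
sample = """
{"report": [{"vg": [
  {"vg_name": "ceph-db", "vg_size": "107374182400B", "vg_free": "53687091200B"}
]}]}
"""

def vg_sizes(report_json):
    """Map vg_name -> (total_bytes, free_bytes) from an LVM JSON report."""
    report = json.loads(report_json)
    return {
        vg["vg_name"]: (int(vg["vg_size"].rstrip("B")),
                        int(vg["vg_free"].rstrip("B")))
        for entry in report["report"]
        for vg in entry.get("vg", [])
    }
```

With no DB/WAL VGs configured, the combined report here is empty, which is why "Print LVM VGs report data" shows `"vg": []`.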
orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.121236 | orchestrator |
2026-04-20 00:45:24.121243 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] ************************
2026-04-20 00:45:24.121250 | orchestrator | Monday 20 April 2026 00:45:23 +0000 (0:00:00.127) 0:00:39.693 **********
2026-04-20 00:45:24.121256 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.121261 | orchestrator |
2026-04-20 00:45:24.121268 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] ****************************
2026-04-20 00:45:24.121275 | orchestrator | Monday 20 April 2026 00:45:23 +0000 (0:00:00.136) 0:00:39.830 **********
2026-04-20 00:45:24.121281 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.121288 | orchestrator |
2026-04-20 00:45:24.121305 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] *******************
2026-04-20 00:45:24.121313 | orchestrator | Monday 20 April 2026 00:45:23 +0000 (0:00:00.124) 0:00:39.954 **********
2026-04-20 00:45:24.121319 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:24.121325 | orchestrator |
2026-04-20 00:45:24.121341 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] ***********************
2026-04-20 00:45:28.437127 | orchestrator | Monday 20 April 2026 00:45:24 +0000 (0:00:00.153) 0:00:40.108 **********
2026-04-20 00:45:28.437227 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.437246 | orchestrator |
2026-04-20 00:45:28.437262 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] ***************************
2026-04-20 00:45:28.437277 | orchestrator | Monday 20 April 2026 00:45:24 +0000 (0:00:00.131) 0:00:40.240 **********
2026-04-20 00:45:28.437292 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.437306 | orchestrator |
2026-04-20 00:45:28.437320 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] *****************
2026-04-20 00:45:28.437335 | orchestrator | Monday 20 April 2026 00:45:24 +0000 (0:00:00.279) 0:00:40.519 **********
2026-04-20 00:45:28.437350 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.437364 | orchestrator |
2026-04-20 00:45:28.437378 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] ****************
2026-04-20 00:45:28.437393 | orchestrator | Monday 20 April 2026 00:45:24 +0000 (0:00:00.132) 0:00:40.651 **********
2026-04-20 00:45:28.437408 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.437502 | orchestrator |
2026-04-20 00:45:28.437520 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ********************
2026-04-20 00:45:28.437535 | orchestrator | Monday 20 April 2026 00:45:24 +0000 (0:00:00.134) 0:00:40.786 **********
2026-04-20 00:45:28.437551 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.437567 | orchestrator |
2026-04-20 00:45:28.437598 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] *****************
2026-04-20 00:45:28.437616 | orchestrator | Monday 20 April 2026 00:45:24 +0000 (0:00:00.123) 0:00:40.909 **********
2026-04-20 00:45:28.437631 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.437648 | orchestrator |
2026-04-20 00:45:28.437664 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] *********************
2026-04-20 00:45:28.437680 | orchestrator | Monday 20 April 2026 00:45:25 +0000 (0:00:00.179) 0:00:41.088 **********
2026-04-20 00:45:28.437697 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.437713 | orchestrator |
2026-04-20 00:45:28.437729 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] ***********
2026-04-20 00:45:28.437746 | orchestrator | Monday 20 April 2026 00:45:25 +0000 (0:00:00.132) 0:00:41.221 **********
2026-04-20 00:45:28.437761 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.437777 | orchestrator |
2026-04-20 00:45:28.437792 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] *************************
2026-04-20 00:45:28.437808 | orchestrator | Monday 20 April 2026 00:45:25 +0000 (0:00:00.121) 0:00:41.343 **********
2026-04-20 00:45:28.437824 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.437840 | orchestrator |
2026-04-20 00:45:28.437881 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] *********************
2026-04-20 00:45:28.437898 | orchestrator | Monday 20 April 2026 00:45:25 +0000 (0:00:00.132) 0:00:41.475 **********
2026-04-20 00:45:28.437909 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.437918 | orchestrator |
2026-04-20 00:45:28.437927 | orchestrator | TASK [Create DB LVs for ceph_db_devices] ***************************************
2026-04-20 00:45:28.437935 | orchestrator | Monday 20 April 2026 00:45:25 +0000 (0:00:00.137) 0:00:41.613 **********
2026-04-20 00:45:28.437945 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:28.437955 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:28.437964 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.437972 | orchestrator |
2026-04-20 00:45:28.437981 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] *******************************
2026-04-20 00:45:28.437990 | orchestrator | Monday 20 April 2026 00:45:25 +0000 (0:00:00.149) 0:00:41.763 **********
2026-04-20 00:45:28.437998 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:28.438007 | orchestrator | skipping: [testbed-node-4] =>
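The two "Fail if DB LV size < 30 GiB" tasks above (skipped here, since no dedicated DB or DB+WAL devices are configured in this run) enforce a 30 GiB floor per DB LV. As a rough sketch of that guard (illustrative only; the real check is an Ansible task, not Python):

```python
MIN_DB_LV_BYTES = 30 * 1024**3  # the 30 GiB floor named in the task titles

def db_lv_large_enough(lv_size_bytes: int) -> bool:
    """True if a DB LV meets the 30 GiB minimum checked by the play."""
    return lv_size_bytes >= MIN_DB_LV_BYTES
```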
(item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:28.438132 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.438142 | orchestrator |
2026-04-20 00:45:28.438150 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] *************************************
2026-04-20 00:45:28.438157 | orchestrator | Monday 20 April 2026 00:45:25 +0000 (0:00:00.140) 0:00:41.904 **********
2026-04-20 00:45:28.438165 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:28.438173 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:28.438181 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.438189 | orchestrator |
2026-04-20 00:45:28.438197 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] *****************************
2026-04-20 00:45:28.438205 | orchestrator | Monday 20 April 2026 00:45:26 +0000 (0:00:00.153) 0:00:42.057 **********
2026-04-20 00:45:28.438212 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:28.438220 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:28.438229 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.438237 | orchestrator |
2026-04-20 00:45:28.438261 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] **********************************
2026-04-20 00:45:28.438269 | orchestrator | Monday 20 April 2026 00:45:26 +0000 (0:00:00.318) 0:00:42.376 **********
2026-04-20 00:45:28.438277 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:28.438285 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:28.438293 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.438301 | orchestrator |
2026-04-20 00:45:28.438309 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] **************************
2026-04-20 00:45:28.438317 | orchestrator | Monday 20 April 2026 00:45:26 +0000 (0:00:00.161) 0:00:42.537 **********
2026-04-20 00:45:28.438324 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:28.438345 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:28.438354 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.438362 | orchestrator |
2026-04-20 00:45:28.438369 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] ***********************************
2026-04-20 00:45:28.438377 | orchestrator | Monday 20 April 2026 00:45:26 +0000 (0:00:00.139) 0:00:42.677 **********
2026-04-20 00:45:28.438385 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:28.438393 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:28.438401 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.438408 | orchestrator |
2026-04-20 00:45:28.438416 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] ***************************
2026-04-20 00:45:28.438445 | orchestrator | Monday 20 April 2026 00:45:26 +0000 (0:00:00.134) 0:00:42.811 **********
2026-04-20 00:45:28.438453 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:28.438461 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:28.438469 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.438476 | orchestrator |
2026-04-20 00:45:28.438484 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ********************************
2026-04-20 00:45:28.438492 | orchestrator | Monday 20 April 2026 00:45:26 +0000 (0:00:00.141) 0:00:42.953 **********
2026-04-20 00:45:28.438500 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:45:28.438508 | orchestrator |
2026-04-20 00:45:28.438515 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ********************************
2026-04-20 00:45:28.438523 | orchestrator | Monday 20 April 2026 00:45:27 +0000 (0:00:00.526) 0:00:43.480 **********
2026-04-20 00:45:28.438531 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:45:28.438539 | orchestrator |
2026-04-20 00:45:28.438546 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] ***********************
2026-04-20 00:45:28.438554 | orchestrator | Monday 20 April 2026 00:45:27 +0000 (0:00:00.455) 0:00:43.935 **********
2026-04-20 00:45:28.438562 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:45:28.438570 | orchestrator |
2026-04-20 00:45:28.438577 | orchestrator | TASK [Create list of VG/LV names] **********************************************
2026-04-20 00:45:28.438585 | orchestrator | Monday 20 April 2026
00:45:28 +0000 (0:00:00.118) 0:00:44.054 **********
2026-04-20 00:45:28.438593 | orchestrator | ok: [testbed-node-4] => (item={'lv_name': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'vg_name': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:28.438602 | orchestrator | ok: [testbed-node-4] => (item={'lv_name': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'vg_name': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:28.438610 | orchestrator |
2026-04-20 00:45:28.438618 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] **********************
2026-04-20 00:45:28.438625 | orchestrator | Monday 20 April 2026 00:45:28 +0000 (0:00:00.151) 0:00:44.205 **********
2026-04-20 00:45:28.438633 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:28.438641 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:28.438649 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:28.438657 | orchestrator |
2026-04-20 00:45:28.438670 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] *************************
2026-04-20 00:45:28.438677 | orchestrator | Monday 20 April 2026 00:45:28 +0000 (0:00:00.149) 0:00:44.355 **********
2026-04-20 00:45:28.438685 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:28.438699 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:33.598759 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:33.598855 | orchestrator |
2026-04-20 00:45:33.598866 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************
2026-04-20 00:45:33.598875 | orchestrator | Monday 20 April 2026 00:45:28 +0000 (0:00:00.153) 0:00:44.508 **********
2026-04-20 00:45:33.598883 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:45:33.598892 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:45:33.598915 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:45:33.598922 | orchestrator |
2026-04-20 00:45:33.598929 | orchestrator | TASK [Print LVM report data] ***************************************************
2026-04-20 00:45:33.598936 | orchestrator | Monday 20 April 2026 00:45:28 +0000 (0:00:00.129) 0:00:44.638 **********
2026-04-20 00:45:33.598942 | orchestrator | ok: [testbed-node-4] => {
2026-04-20 00:45:33.598948 | orchestrator |     "lvm_report": {
2026-04-20 00:45:33.598956 | orchestrator |         "lv": [
2026-04-20 00:45:33.598963 | orchestrator |             {
2026-04-20 00:45:33.598983 | orchestrator |                 "lv_name": "osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9",
2026-04-20 00:45:33.598991 | orchestrator |                 "vg_name": "ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9"
2026-04-20 00:45:33.598998 | orchestrator |             },
2026-04-20 00:45:33.599004 | orchestrator |             {
2026-04-20 00:45:33.599010 | orchestrator |                 "lv_name": "osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561",
2026-04-20 00:45:33.599017 | orchestrator |                 "vg_name": "ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561"
2026-04-20 00:45:33.599023 | orchestrator |             }
2026-04-20 00:45:33.599029 | orchestrator |         ],
2026-04-20 00:45:33.599036 | orchestrator |         "pv": [
2026-04-20 00:45:33.599042 | orchestrator |             {
2026-04-20 00:45:33.599048 | orchestrator |                 "pv_name": "/dev/sdb",
2026-04-20 00:45:33.599055 | orchestrator |                 "vg_name": "ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9"
2026-04-20 00:45:33.599061 | orchestrator |             },
2026-04-20 00:45:33.599067 | orchestrator |             {
2026-04-20 00:45:33.599074 | orchestrator |                 "pv_name": "/dev/sdc",
2026-04-20 00:45:33.599080 | orchestrator |                 "vg_name": "ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561"
2026-04-20 00:45:33.599087 | orchestrator |             }
2026-04-20 00:45:33.599093 | orchestrator |         ]
2026-04-20 00:45:33.599100 | orchestrator |     }
2026-04-20 00:45:33.599106 | orchestrator | }
2026-04-20 00:45:33.599113 | orchestrator |
2026-04-20 00:45:33.599119 | orchestrator | PLAY [Ceph create LVM devices] *************************************************
2026-04-20 00:45:33.599126 | orchestrator |
2026-04-20 00:45:33.599132 | orchestrator | TASK [Get extra vars for Ceph configuration] ***********************************
2026-04-20 00:45:33.599138 | orchestrator | Monday 20 April 2026 00:45:28 +0000 (0:00:00.346) 0:00:44.985 **********
2026-04-20 00:45:33.599145 | orchestrator | ok: [testbed-node-5 -> testbed-manager(192.168.16.5)]
2026-04-20 00:45:33.599152 | orchestrator |
2026-04-20 00:45:33.599158 | orchestrator | TASK [Get initial list of available block devices] *****************************
2026-04-20 00:45:33.599164 | orchestrator | Monday 20 April 2026 00:45:29 +0000 (0:00:00.204) 0:00:45.189 **********
2026-04-20 00:45:33.599171 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:45:33.599194 | orchestrator |
2026-04-20 00:45:33.599200 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2026-04-20 00:45:33.599207 | orchestrator | Monday 20 April 2026 00:45:29 +0000 (0:00:00.227) 0:00:45.416 **********
2026-04-20 00:45:33.599213 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop0)
2026-04-20 00:45:33.599219 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop1)
2026-04-20
00:45:33.599226 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop2) 2026-04-20 00:45:33.599232 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop3) 2026-04-20 00:45:33.599242 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop4) 2026-04-20 00:45:33.599248 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop5) 2026-04-20 00:45:33.599255 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop6) 2026-04-20 00:45:33.599261 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop7) 2026-04-20 00:45:33.599267 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sda) 2026-04-20 00:45:33.599274 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdb) 2026-04-20 00:45:33.599280 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdc) 2026-04-20 00:45:33.599286 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdd) 2026-04-20 00:45:33.599292 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sr0) 2026-04-20 00:45:33.599299 | orchestrator | 2026-04-20 00:45:33.599305 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:45:33.599311 | orchestrator | Monday 20 April 2026 00:45:29 +0000 (0:00:00.364) 0:00:45.780 ********** 2026-04-20 00:45:33.599318 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:33.599324 | orchestrator | 2026-04-20 00:45:33.599330 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:45:33.599337 | orchestrator | Monday 20 April 2026 00:45:29 +0000 (0:00:00.176) 0:00:45.957 
********** 2026-04-20 00:45:33.599344 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:33.599350 | orchestrator | 2026-04-20 00:45:33.599357 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:45:33.599377 | orchestrator | Monday 20 April 2026 00:45:30 +0000 (0:00:00.164) 0:00:46.122 ********** 2026-04-20 00:45:33.599384 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:33.599391 | orchestrator | 2026-04-20 00:45:33.599397 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:45:33.599404 | orchestrator | Monday 20 April 2026 00:45:30 +0000 (0:00:00.162) 0:00:46.284 ********** 2026-04-20 00:45:33.599411 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:33.599417 | orchestrator | 2026-04-20 00:45:33.599446 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:45:33.599454 | orchestrator | Monday 20 April 2026 00:45:30 +0000 (0:00:00.168) 0:00:46.453 ********** 2026-04-20 00:45:33.599460 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:33.599466 | orchestrator | 2026-04-20 00:45:33.599473 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:45:33.599479 | orchestrator | Monday 20 April 2026 00:45:30 +0000 (0:00:00.189) 0:00:46.643 ********** 2026-04-20 00:45:33.599486 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:33.599493 | orchestrator | 2026-04-20 00:45:33.599500 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:45:33.599507 | orchestrator | Monday 20 April 2026 00:45:31 +0000 (0:00:00.483) 0:00:47.126 ********** 2026-04-20 00:45:33.599514 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:33.599523 | orchestrator | 2026-04-20 00:45:33.599540 | orchestrator | TASK [Add known links to the list of 
available block devices] ****************** 2026-04-20 00:45:33.599552 | orchestrator | Monday 20 April 2026 00:45:31 +0000 (0:00:00.182) 0:00:47.309 ********** 2026-04-20 00:45:33.599564 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:33.599576 | orchestrator | 2026-04-20 00:45:33.599584 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:45:33.599591 | orchestrator | Monday 20 April 2026 00:45:31 +0000 (0:00:00.166) 0:00:47.476 ********** 2026-04-20 00:45:33.599597 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203) 2026-04-20 00:45:33.599606 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203) 2026-04-20 00:45:33.599612 | orchestrator | 2026-04-20 00:45:33.599618 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:45:33.599624 | orchestrator | Monday 20 April 2026 00:45:31 +0000 (0:00:00.341) 0:00:47.817 ********** 2026-04-20 00:45:33.599630 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_bdcbd50e-fc40-4173-bc88-351fd741a560) 2026-04-20 00:45:33.599671 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_bdcbd50e-fc40-4173-bc88-351fd741a560) 2026-04-20 00:45:33.599679 | orchestrator | 2026-04-20 00:45:33.599685 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:45:33.599691 | orchestrator | Monday 20 April 2026 00:45:32 +0000 (0:00:00.385) 0:00:48.203 ********** 2026-04-20 00:45:33.599697 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_bb585aa1-11e8-43ef-a761-9431875b84d1) 2026-04-20 00:45:33.599703 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_bb585aa1-11e8-43ef-a761-9431875b84d1) 2026-04-20 00:45:33.599709 | orchestrator | 2026-04-20 00:45:33.599714 | orchestrator | 
TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:45:33.599720 | orchestrator | Monday 20 April 2026 00:45:32 +0000 (0:00:00.396) 0:00:48.599 ********** 2026-04-20 00:45:33.599726 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_6895d0f2-ba69-41e1-a4cc-d0f527389fe4) 2026-04-20 00:45:33.599733 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_6895d0f2-ba69-41e1-a4cc-d0f527389fe4) 2026-04-20 00:45:33.599739 | orchestrator | 2026-04-20 00:45:33.599745 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2026-04-20 00:45:33.599751 | orchestrator | Monday 20 April 2026 00:45:33 +0000 (0:00:00.407) 0:00:49.007 ********** 2026-04-20 00:45:33.599758 | orchestrator | ok: [testbed-node-5] => (item=ata-QEMU_DVD-ROM_QM00001) 2026-04-20 00:45:33.599764 | orchestrator | 2026-04-20 00:45:33.599770 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:45:33.599777 | orchestrator | Monday 20 April 2026 00:45:33 +0000 (0:00:00.298) 0:00:49.306 ********** 2026-04-20 00:45:33.599783 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop0) 2026-04-20 00:45:33.599790 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop1) 2026-04-20 00:45:33.599796 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop2) 2026-04-20 00:45:33.599802 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop3) 2026-04-20 00:45:33.599808 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop4) 2026-04-20 00:45:33.599815 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop5) 2026-04-20 00:45:33.599821 | orchestrator | included: 
/ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop6) 2026-04-20 00:45:33.599827 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop7) 2026-04-20 00:45:33.599833 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sda) 2026-04-20 00:45:33.599846 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdb) 2026-04-20 00:45:33.599853 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdc) 2026-04-20 00:45:33.599865 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdd) 2026-04-20 00:45:41.510802 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sr0) 2026-04-20 00:45:41.510855 | orchestrator | 2026-04-20 00:45:41.510862 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:45:41.510866 | orchestrator | Monday 20 April 2026 00:45:33 +0000 (0:00:00.359) 0:00:49.665 ********** 2026-04-20 00:45:41.510871 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.510875 | orchestrator | 2026-04-20 00:45:41.510879 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:45:41.510883 | orchestrator | Monday 20 April 2026 00:45:33 +0000 (0:00:00.192) 0:00:49.857 ********** 2026-04-20 00:45:41.510886 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.510890 | orchestrator | 2026-04-20 00:45:41.510894 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:45:41.510898 | orchestrator | Monday 20 April 2026 00:45:34 +0000 (0:00:00.202) 0:00:50.060 ********** 2026-04-20 00:45:41.510901 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.510905 | orchestrator | 2026-04-20 00:45:41.510909 | 
orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:45:41.510920 | orchestrator | Monday 20 April 2026 00:45:34 +0000 (0:00:00.495) 0:00:50.555 ********** 2026-04-20 00:45:41.510924 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.510928 | orchestrator | 2026-04-20 00:45:41.510932 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:45:41.510936 | orchestrator | Monday 20 April 2026 00:45:34 +0000 (0:00:00.174) 0:00:50.730 ********** 2026-04-20 00:45:41.510939 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.510943 | orchestrator | 2026-04-20 00:45:41.510947 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:45:41.510951 | orchestrator | Monday 20 April 2026 00:45:34 +0000 (0:00:00.164) 0:00:50.894 ********** 2026-04-20 00:45:41.510954 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.510958 | orchestrator | 2026-04-20 00:45:41.510962 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:45:41.510965 | orchestrator | Monday 20 April 2026 00:45:35 +0000 (0:00:00.168) 0:00:51.063 ********** 2026-04-20 00:45:41.510969 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.510973 | orchestrator | 2026-04-20 00:45:41.510976 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:45:41.510980 | orchestrator | Monday 20 April 2026 00:45:35 +0000 (0:00:00.183) 0:00:51.246 ********** 2026-04-20 00:45:41.510984 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.510987 | orchestrator | 2026-04-20 00:45:41.510991 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:45:41.510995 | orchestrator | Monday 20 April 2026 00:45:35 +0000 (0:00:00.152) 0:00:51.399 ********** 
2026-04-20 00:45:41.510999 | orchestrator | ok: [testbed-node-5] => (item=sda1) 2026-04-20 00:45:41.511003 | orchestrator | ok: [testbed-node-5] => (item=sda14) 2026-04-20 00:45:41.511007 | orchestrator | ok: [testbed-node-5] => (item=sda15) 2026-04-20 00:45:41.511011 | orchestrator | ok: [testbed-node-5] => (item=sda16) 2026-04-20 00:45:41.511015 | orchestrator | 2026-04-20 00:45:41.511018 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:45:41.511022 | orchestrator | Monday 20 April 2026 00:45:35 +0000 (0:00:00.580) 0:00:51.979 ********** 2026-04-20 00:45:41.511026 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.511029 | orchestrator | 2026-04-20 00:45:41.511033 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:45:41.511037 | orchestrator | Monday 20 April 2026 00:45:36 +0000 (0:00:00.175) 0:00:52.155 ********** 2026-04-20 00:45:41.511051 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.511055 | orchestrator | 2026-04-20 00:45:41.511059 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:45:41.511063 | orchestrator | Monday 20 April 2026 00:45:36 +0000 (0:00:00.186) 0:00:52.341 ********** 2026-04-20 00:45:41.511066 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.511070 | orchestrator | 2026-04-20 00:45:41.511074 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2026-04-20 00:45:41.511077 | orchestrator | Monday 20 April 2026 00:45:36 +0000 (0:00:00.171) 0:00:52.513 ********** 2026-04-20 00:45:41.511081 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.511085 | orchestrator | 2026-04-20 00:45:41.511088 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] ******************* 2026-04-20 00:45:41.511092 | orchestrator | Monday 20 April 2026 00:45:36 +0000 
(0:00:00.181) 0:00:52.695 ********** 2026-04-20 00:45:41.511096 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.511099 | orchestrator | 2026-04-20 00:45:41.511103 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] ******************* 2026-04-20 00:45:41.511107 | orchestrator | Monday 20 April 2026 00:45:36 +0000 (0:00:00.103) 0:00:52.798 ********** 2026-04-20 00:45:41.511111 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'f2b53557-bc93-5e7c-9922-524bc90e2f58'}}) 2026-04-20 00:45:41.511115 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '575cdf11-a3b3-50b3-a6b0-c04d40287ec6'}}) 2026-04-20 00:45:41.511118 | orchestrator | 2026-04-20 00:45:41.511122 | orchestrator | TASK [Create block VGs] ******************************************************** 2026-04-20 00:45:41.511126 | orchestrator | Monday 20 April 2026 00:45:37 +0000 (0:00:00.309) 0:00:53.108 ********** 2026-04-20 00:45:41.511131 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'}) 2026-04-20 00:45:41.511135 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'}) 2026-04-20 00:45:41.511139 | orchestrator | 2026-04-20 00:45:41.511143 | orchestrator | TASK [Print 'Create block VGs'] ************************************************ 2026-04-20 00:45:41.511153 | orchestrator | Monday 20 April 2026 00:45:38 +0000 (0:00:01.852) 0:00:54.960 ********** 2026-04-20 00:45:41.511158 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:41.511162 | orchestrator | skipping: [testbed-node-5] => (item={'data': 
'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 00:45:41.511166 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.511169 | orchestrator | 2026-04-20 00:45:41.511173 | orchestrator | TASK [Create block LVs] ******************************************************** 2026-04-20 00:45:41.511177 | orchestrator | Monday 20 April 2026 00:45:39 +0000 (0:00:00.131) 0:00:55.092 ********** 2026-04-20 00:45:41.511181 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'}) 2026-04-20 00:45:41.511186 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'}) 2026-04-20 00:45:41.511190 | orchestrator | 2026-04-20 00:45:41.511194 | orchestrator | TASK [Print 'Create block LVs'] ************************************************ 2026-04-20 00:45:41.511198 | orchestrator | Monday 20 April 2026 00:45:40 +0000 (0:00:01.403) 0:00:56.495 ********** 2026-04-20 00:45:41.511201 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:41.511205 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 00:45:41.511212 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.511215 | orchestrator | 2026-04-20 00:45:41.511219 | orchestrator | TASK [Create DB VGs] *********************************************************** 2026-04-20 00:45:41.511223 | orchestrator | Monday 20 April 2026 00:45:40 +0000 (0:00:00.134) 0:00:56.630 ********** 2026-04-20 00:45:41.511227 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.511230 | 
orchestrator | 2026-04-20 00:45:41.511234 | orchestrator | TASK [Print 'Create DB VGs'] *************************************************** 2026-04-20 00:45:41.511238 | orchestrator | Monday 20 April 2026 00:45:40 +0000 (0:00:00.109) 0:00:56.739 ********** 2026-04-20 00:45:41.511241 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:41.511245 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 00:45:41.511249 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.511252 | orchestrator | 2026-04-20 00:45:41.511256 | orchestrator | TASK [Create WAL VGs] ********************************************************** 2026-04-20 00:45:41.511260 | orchestrator | Monday 20 April 2026 00:45:40 +0000 (0:00:00.125) 0:00:56.864 ********** 2026-04-20 00:45:41.511263 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.511267 | orchestrator | 2026-04-20 00:45:41.511271 | orchestrator | TASK [Print 'Create WAL VGs'] ************************************************** 2026-04-20 00:45:41.511274 | orchestrator | Monday 20 April 2026 00:45:40 +0000 (0:00:00.111) 0:00:56.976 ********** 2026-04-20 00:45:41.511278 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:41.511282 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 00:45:41.511286 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.511289 | orchestrator | 2026-04-20 00:45:41.511293 | orchestrator | TASK [Create DB+WAL VGs] ******************************************************* 
2026-04-20 00:45:41.511297 | orchestrator | Monday 20 April 2026 00:45:41 +0000 (0:00:00.137) 0:00:57.113 ********** 2026-04-20 00:45:41.511301 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.511304 | orchestrator | 2026-04-20 00:45:41.511308 | orchestrator | TASK [Print 'Create DB+WAL VGs'] *********************************************** 2026-04-20 00:45:41.511312 | orchestrator | Monday 20 April 2026 00:45:41 +0000 (0:00:00.114) 0:00:57.229 ********** 2026-04-20 00:45:41.511315 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:41.511319 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 00:45:41.511323 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:41.511326 | orchestrator | 2026-04-20 00:45:41.511330 | orchestrator | TASK [Prepare variables for OSD count check] *********************************** 2026-04-20 00:45:41.511334 | orchestrator | Monday 20 April 2026 00:45:41 +0000 (0:00:00.124) 0:00:57.353 ********** 2026-04-20 00:45:41.511338 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:45:41.511341 | orchestrator | 2026-04-20 00:45:41.511345 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] **************** 2026-04-20 00:45:41.511349 | orchestrator | Monday 20 April 2026 00:45:41 +0000 (0:00:00.101) 0:00:57.454 ********** 2026-04-20 00:45:41.511356 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:47.500979 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 00:45:47.501101 | 
orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.501114 | orchestrator | 2026-04-20 00:45:47.501122 | orchestrator | TASK [Count OSDs put on ceph_wal_devices defined in lvm_volumes] *************** 2026-04-20 00:45:47.501131 | orchestrator | Monday 20 April 2026 00:45:41 +0000 (0:00:00.255) 0:00:57.710 ********** 2026-04-20 00:45:47.501138 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:47.501144 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 00:45:47.501151 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.501158 | orchestrator | 2026-04-20 00:45:47.501181 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************ 2026-04-20 00:45:47.501188 | orchestrator | Monday 20 April 2026 00:45:41 +0000 (0:00:00.128) 0:00:57.838 ********** 2026-04-20 00:45:47.501195 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:47.501201 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 00:45:47.501208 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.501214 | orchestrator | 2026-04-20 00:45:47.501221 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] ********************* 2026-04-20 00:45:47.501227 | orchestrator | Monday 20 April 2026 00:45:41 +0000 (0:00:00.118) 0:00:57.956 ********** 2026-04-20 00:45:47.501234 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.501241 | orchestrator | 2026-04-20 00:45:47.501247 | orchestrator | TASK [Fail 
if number of OSDs exceeds num_osds for a WAL VG] ******************** 2026-04-20 00:45:47.501254 | orchestrator | Monday 20 April 2026 00:45:42 +0000 (0:00:00.112) 0:00:58.068 ********** 2026-04-20 00:45:47.501261 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.501267 | orchestrator | 2026-04-20 00:45:47.501274 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] ***************** 2026-04-20 00:45:47.501281 | orchestrator | Monday 20 April 2026 00:45:42 +0000 (0:00:00.098) 0:00:58.167 ********** 2026-04-20 00:45:47.501287 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.501294 | orchestrator | 2026-04-20 00:45:47.501301 | orchestrator | TASK [Print number of OSDs wanted per DB VG] *********************************** 2026-04-20 00:45:47.501308 | orchestrator | Monday 20 April 2026 00:45:42 +0000 (0:00:00.122) 0:00:58.290 ********** 2026-04-20 00:45:47.501314 | orchestrator | ok: [testbed-node-5] => { 2026-04-20 00:45:47.501322 | orchestrator |  "_num_osds_wanted_per_db_vg": {} 2026-04-20 00:45:47.501329 | orchestrator | } 2026-04-20 00:45:47.501336 | orchestrator | 2026-04-20 00:45:47.501342 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] ********************************** 2026-04-20 00:45:47.501349 | orchestrator | Monday 20 April 2026 00:45:42 +0000 (0:00:00.117) 0:00:58.407 ********** 2026-04-20 00:45:47.501355 | orchestrator | ok: [testbed-node-5] => { 2026-04-20 00:45:47.501362 | orchestrator |  "_num_osds_wanted_per_wal_vg": {} 2026-04-20 00:45:47.501368 | orchestrator | } 2026-04-20 00:45:47.501375 | orchestrator | 2026-04-20 00:45:47.501382 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] ******************************* 2026-04-20 00:45:47.501389 | orchestrator | Monday 20 April 2026 00:45:42 +0000 (0:00:00.118) 0:00:58.526 ********** 2026-04-20 00:45:47.501394 | orchestrator | ok: [testbed-node-5] => { 2026-04-20 00:45:47.501400 | orchestrator |  
"_num_osds_wanted_per_db_wal_vg": {} 2026-04-20 00:45:47.501407 | orchestrator | } 2026-04-20 00:45:47.501413 | orchestrator | 2026-04-20 00:45:47.501420 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ******************** 2026-04-20 00:45:47.501426 | orchestrator | Monday 20 April 2026 00:45:42 +0000 (0:00:00.112) 0:00:58.638 ********** 2026-04-20 00:45:47.501540 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:45:47.501551 | orchestrator | 2026-04-20 00:45:47.501558 | orchestrator | TASK [Gather WAL VGs with total and available size in bytes] ******************* 2026-04-20 00:45:47.501565 | orchestrator | Monday 20 April 2026 00:45:43 +0000 (0:00:00.456) 0:00:59.095 ********** 2026-04-20 00:45:47.501572 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:45:47.501578 | orchestrator | 2026-04-20 00:45:47.501584 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] **************** 2026-04-20 00:45:47.501591 | orchestrator | Monday 20 April 2026 00:45:43 +0000 (0:00:00.478) 0:00:59.574 ********** 2026-04-20 00:45:47.501598 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:45:47.501605 | orchestrator | 2026-04-20 00:45:47.501612 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] ************************* 2026-04-20 00:45:47.501619 | orchestrator | Monday 20 April 2026 00:45:44 +0000 (0:00:00.474) 0:01:00.049 ********** 2026-04-20 00:45:47.501626 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:45:47.501633 | orchestrator | 2026-04-20 00:45:47.501639 | orchestrator | TASK [Calculate VG sizes (without buffer)] ************************************* 2026-04-20 00:45:47.501646 | orchestrator | Monday 20 April 2026 00:45:44 +0000 (0:00:00.390) 0:01:00.439 ********** 2026-04-20 00:45:47.501653 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.501660 | orchestrator | 2026-04-20 00:45:47.501667 | orchestrator | TASK [Calculate VG sizes (with buffer)] 
**************************************** 2026-04-20 00:45:47.501674 | orchestrator | Monday 20 April 2026 00:45:44 +0000 (0:00:00.117) 0:01:00.557 ********** 2026-04-20 00:45:47.501681 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.501687 | orchestrator | 2026-04-20 00:45:47.501694 | orchestrator | TASK [Print LVM VGs report data] *********************************************** 2026-04-20 00:45:47.501701 | orchestrator | Monday 20 April 2026 00:45:44 +0000 (0:00:00.102) 0:01:00.660 ********** 2026-04-20 00:45:47.501708 | orchestrator | ok: [testbed-node-5] => { 2026-04-20 00:45:47.501715 | orchestrator |  "vgs_report": { 2026-04-20 00:45:47.501723 | orchestrator |  "vg": [] 2026-04-20 00:45:47.501746 | orchestrator |  } 2026-04-20 00:45:47.501753 | orchestrator | } 2026-04-20 00:45:47.501760 | orchestrator | 2026-04-20 00:45:47.501767 | orchestrator | TASK [Print LVM VG sizes] ****************************************************** 2026-04-20 00:45:47.501774 | orchestrator | Monday 20 April 2026 00:45:44 +0000 (0:00:00.141) 0:01:00.802 ********** 2026-04-20 00:45:47.501782 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.501789 | orchestrator | 2026-04-20 00:45:47.501796 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] ************************ 2026-04-20 00:45:47.501802 | orchestrator | Monday 20 April 2026 00:45:44 +0000 (0:00:00.133) 0:01:00.935 ********** 2026-04-20 00:45:47.501809 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.501816 | orchestrator | 2026-04-20 00:45:47.501821 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] **************************** 2026-04-20 00:45:47.501826 | orchestrator | Monday 20 April 2026 00:45:45 +0000 (0:00:00.135) 0:01:01.070 ********** 2026-04-20 00:45:47.501832 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.501838 | orchestrator | 2026-04-20 00:45:47.501844 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices 
> available] ******************* 2026-04-20 00:45:47.501850 | orchestrator | Monday 20 April 2026 00:45:45 +0000 (0:00:00.144) 0:01:01.214 ********** 2026-04-20 00:45:47.501857 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.501863 | orchestrator | 2026-04-20 00:45:47.501870 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] *********************** 2026-04-20 00:45:47.501875 | orchestrator | Monday 20 April 2026 00:45:45 +0000 (0:00:00.154) 0:01:01.368 ********** 2026-04-20 00:45:47.501882 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.501888 | orchestrator | 2026-04-20 00:45:47.501895 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] *************************** 2026-04-20 00:45:47.501901 | orchestrator | Monday 20 April 2026 00:45:45 +0000 (0:00:00.152) 0:01:01.521 ********** 2026-04-20 00:45:47.501907 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.501913 | orchestrator | 2026-04-20 00:45:47.501926 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] ***************** 2026-04-20 00:45:47.501933 | orchestrator | Monday 20 April 2026 00:45:45 +0000 (0:00:00.136) 0:01:01.657 ********** 2026-04-20 00:45:47.501938 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.501944 | orchestrator | 2026-04-20 00:45:47.501950 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] **************** 2026-04-20 00:45:47.501968 | orchestrator | Monday 20 April 2026 00:45:45 +0000 (0:00:00.137) 0:01:01.795 ********** 2026-04-20 00:45:47.501974 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.501980 | orchestrator | 2026-04-20 00:45:47.501986 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ******************** 2026-04-20 00:45:47.501992 | orchestrator | Monday 20 April 2026 00:45:45 +0000 (0:00:00.144) 0:01:01.940 ********** 2026-04-20 00:45:47.501998 | orchestrator | skipping: 
[testbed-node-5] 2026-04-20 00:45:47.502005 | orchestrator | 2026-04-20 00:45:47.502011 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] ***************** 2026-04-20 00:45:47.502069 | orchestrator | Monday 20 April 2026 00:45:46 +0000 (0:00:00.439) 0:01:02.379 ********** 2026-04-20 00:45:47.502080 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.502087 | orchestrator | 2026-04-20 00:45:47.502094 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] ********************* 2026-04-20 00:45:47.502101 | orchestrator | Monday 20 April 2026 00:45:46 +0000 (0:00:00.154) 0:01:02.533 ********** 2026-04-20 00:45:47.502107 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.502113 | orchestrator | 2026-04-20 00:45:47.502119 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] *********** 2026-04-20 00:45:47.502126 | orchestrator | Monday 20 April 2026 00:45:46 +0000 (0:00:00.142) 0:01:02.676 ********** 2026-04-20 00:45:47.502133 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.502139 | orchestrator | 2026-04-20 00:45:47.502146 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] ************************* 2026-04-20 00:45:47.502152 | orchestrator | Monday 20 April 2026 00:45:46 +0000 (0:00:00.134) 0:01:02.810 ********** 2026-04-20 00:45:47.502159 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.502165 | orchestrator | 2026-04-20 00:45:47.502172 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] ********************* 2026-04-20 00:45:47.502179 | orchestrator | Monday 20 April 2026 00:45:46 +0000 (0:00:00.138) 0:01:02.948 ********** 2026-04-20 00:45:47.502185 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.502192 | orchestrator | 2026-04-20 00:45:47.502199 | orchestrator | TASK [Create DB LVs for ceph_db_devices] *************************************** 2026-04-20 00:45:47.502206 | 
orchestrator | Monday 20 April 2026 00:45:47 +0000 (0:00:00.127) 0:01:03.076 ********** 2026-04-20 00:45:47.502213 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:47.502220 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 00:45:47.502227 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.502234 | orchestrator | 2026-04-20 00:45:47.502240 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] ******************************* 2026-04-20 00:45:47.502247 | orchestrator | Monday 20 April 2026 00:45:47 +0000 (0:00:00.165) 0:01:03.242 ********** 2026-04-20 00:45:47.502254 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:47.502261 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 00:45:47.502268 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:47.502275 | orchestrator | 2026-04-20 00:45:47.502281 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] ************************************* 2026-04-20 00:45:47.502289 | orchestrator | Monday 20 April 2026 00:45:47 +0000 (0:00:00.178) 0:01:03.420 ********** 2026-04-20 00:45:47.502311 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:50.725428 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 
00:45:50.725501 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:50.725508 | orchestrator | 2026-04-20 00:45:50.725512 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] ***************************** 2026-04-20 00:45:50.725517 | orchestrator | Monday 20 April 2026 00:45:47 +0000 (0:00:00.160) 0:01:03.581 ********** 2026-04-20 00:45:50.725522 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:50.725533 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 00:45:50.725537 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:50.725541 | orchestrator | 2026-04-20 00:45:50.725545 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] ********************************** 2026-04-20 00:45:50.725549 | orchestrator | Monday 20 April 2026 00:45:47 +0000 (0:00:00.158) 0:01:03.739 ********** 2026-04-20 00:45:50.725552 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:50.725556 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 00:45:50.725560 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:50.725564 | orchestrator | 2026-04-20 00:45:50.725568 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] ************************** 2026-04-20 00:45:50.725572 | orchestrator | Monday 20 April 2026 00:45:47 +0000 (0:00:00.164) 0:01:03.903 ********** 2026-04-20 00:45:50.725576 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 
'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:50.725580 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 00:45:50.725584 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:50.725590 | orchestrator | 2026-04-20 00:45:50.725600 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] *********************************** 2026-04-20 00:45:50.725607 | orchestrator | Monday 20 April 2026 00:45:48 +0000 (0:00:00.141) 0:01:04.044 ********** 2026-04-20 00:45:50.725614 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:50.725619 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 00:45:50.725623 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:50.725627 | orchestrator | 2026-04-20 00:45:50.725631 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] *************************** 2026-04-20 00:45:50.725635 | orchestrator | Monday 20 April 2026 00:45:48 +0000 (0:00:00.380) 0:01:04.426 ********** 2026-04-20 00:45:50.725640 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:50.725647 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 00:45:50.725656 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:50.725662 | orchestrator | 2026-04-20 00:45:50.725668 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ******************************** 2026-04-20 
00:45:50.725685 | orchestrator | Monday 20 April 2026 00:45:48 +0000 (0:00:00.161) 0:01:04.587 ********** 2026-04-20 00:45:50.725691 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:45:50.725698 | orchestrator | 2026-04-20 00:45:50.725704 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ******************************** 2026-04-20 00:45:50.725710 | orchestrator | Monday 20 April 2026 00:45:49 +0000 (0:00:00.545) 0:01:05.133 ********** 2026-04-20 00:45:50.725716 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:45:50.725722 | orchestrator | 2026-04-20 00:45:50.725728 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] *********************** 2026-04-20 00:45:50.725733 | orchestrator | Monday 20 April 2026 00:45:49 +0000 (0:00:00.571) 0:01:05.704 ********** 2026-04-20 00:45:50.725739 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:45:50.725745 | orchestrator | 2026-04-20 00:45:50.725751 | orchestrator | TASK [Create list of VG/LV names] ********************************************** 2026-04-20 00:45:50.725756 | orchestrator | Monday 20 April 2026 00:45:49 +0000 (0:00:00.139) 0:01:05.844 ********** 2026-04-20 00:45:50.725762 | orchestrator | ok: [testbed-node-5] => (item={'lv_name': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'vg_name': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'}) 2026-04-20 00:45:50.725769 | orchestrator | ok: [testbed-node-5] => (item={'lv_name': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'vg_name': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'}) 2026-04-20 00:45:50.725775 | orchestrator | 2026-04-20 00:45:50.725781 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] ********************** 2026-04-20 00:45:50.725786 | orchestrator | Monday 20 April 2026 00:45:50 +0000 (0:00:00.172) 0:01:06.017 ********** 2026-04-20 00:45:50.725803 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 
'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:50.725811 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 00:45:50.725816 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:50.725820 | orchestrator | 2026-04-20 00:45:50.725824 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] ************************* 2026-04-20 00:45:50.725828 | orchestrator | Monday 20 April 2026 00:45:50 +0000 (0:00:00.158) 0:01:06.175 ********** 2026-04-20 00:45:50.725835 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:50.725839 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 00:45:50.725843 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:50.725849 | orchestrator | 2026-04-20 00:45:50.725855 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************ 2026-04-20 00:45:50.725866 | orchestrator | Monday 20 April 2026 00:45:50 +0000 (0:00:00.184) 0:01:06.360 ********** 2026-04-20 00:45:50.725872 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})  2026-04-20 00:45:50.725878 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})  2026-04-20 00:45:50.725884 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:45:50.725890 | orchestrator | 2026-04-20 00:45:50.725895 | orchestrator | TASK [Print LVM report data] *************************************************** 2026-04-20 
00:45:50.725901 | orchestrator | Monday 20 April 2026 00:45:50 +0000 (0:00:00.167) 0:01:06.527 ********** 2026-04-20 00:45:50.725907 | orchestrator | ok: [testbed-node-5] => { 2026-04-20 00:45:50.725913 | orchestrator |  "lvm_report": { 2026-04-20 00:45:50.725919 | orchestrator |  "lv": [ 2026-04-20 00:45:50.725925 | orchestrator |  { 2026-04-20 00:45:50.725938 | orchestrator |  "lv_name": "osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6", 2026-04-20 00:45:50.725945 | orchestrator |  "vg_name": "ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6" 2026-04-20 00:45:50.725952 | orchestrator |  }, 2026-04-20 00:45:50.725958 | orchestrator |  { 2026-04-20 00:45:50.725964 | orchestrator |  "lv_name": "osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58", 2026-04-20 00:45:50.725971 | orchestrator |  "vg_name": "ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58" 2026-04-20 00:45:50.725977 | orchestrator |  } 2026-04-20 00:45:50.725983 | orchestrator |  ], 2026-04-20 00:45:50.725990 | orchestrator |  "pv": [ 2026-04-20 00:45:50.725996 | orchestrator |  { 2026-04-20 00:45:50.726003 | orchestrator |  "pv_name": "/dev/sdb", 2026-04-20 00:45:50.726010 | orchestrator |  "vg_name": "ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58" 2026-04-20 00:45:50.726084 | orchestrator |  }, 2026-04-20 00:45:50.726090 | orchestrator |  { 2026-04-20 00:45:50.726093 | orchestrator |  "pv_name": "/dev/sdc", 2026-04-20 00:45:50.726097 | orchestrator |  "vg_name": "ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6" 2026-04-20 00:45:50.726103 | orchestrator |  } 2026-04-20 00:45:50.726114 | orchestrator |  ] 2026-04-20 00:45:50.726120 | orchestrator |  } 2026-04-20 00:45:50.726130 | orchestrator | } 2026-04-20 00:45:50.726138 | orchestrator | 2026-04-20 00:45:50.726143 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:45:50.726149 | orchestrator | testbed-node-3 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2026-04-20 00:45:50.726155 | 
orchestrator | testbed-node-4 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2026-04-20 00:45:50.726161 | orchestrator | testbed-node-5 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2026-04-20 00:45:50.726167 | orchestrator | 2026-04-20 00:45:50.726172 | orchestrator | 2026-04-20 00:45:50.726178 | orchestrator | 2026-04-20 00:45:50.726184 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 00:45:50.726190 | orchestrator | Monday 20 April 2026 00:45:50 +0000 (0:00:00.177) 0:01:06.705 ********** 2026-04-20 00:45:50.726200 | orchestrator | =============================================================================== 2026-04-20 00:45:50.726207 | orchestrator | Create block VGs -------------------------------------------------------- 5.54s 2026-04-20 00:45:50.726213 | orchestrator | Create block LVs -------------------------------------------------------- 4.07s 2026-04-20 00:45:50.726219 | orchestrator | Gather DB VGs with total and available size in bytes -------------------- 1.75s 2026-04-20 00:45:50.726225 | orchestrator | Get list of Ceph LVs with associated VGs -------------------------------- 1.60s 2026-04-20 00:45:50.726231 | orchestrator | Get list of Ceph PVs with associated VGs -------------------------------- 1.58s 2026-04-20 00:45:50.726237 | orchestrator | Gather WAL VGs with total and available size in bytes ------------------- 1.56s 2026-04-20 00:45:50.726244 | orchestrator | Add known partitions to the list of available block devices ------------- 1.49s 2026-04-20 00:45:50.726251 | orchestrator | Gather DB+WAL VGs with total and available size in bytes ---------------- 1.46s 2026-04-20 00:45:50.726262 | orchestrator | Add known links to the list of available block devices ------------------ 1.07s 2026-04-20 00:45:51.165885 | orchestrator | Add known partitions to the list of available block devices ------------- 0.86s 2026-04-20 
00:45:51.165947 | orchestrator | Add known partitions to the list of available block devices ------------- 0.83s 2026-04-20 00:45:51.165955 | orchestrator | Print LVM report data --------------------------------------------------- 0.77s 2026-04-20 00:45:51.165962 | orchestrator | Print size needed for WAL LVs on ceph_db_wal_devices -------------------- 0.69s 2026-04-20 00:45:51.165968 | orchestrator | Create DB LVs for ceph_db_wal_devices ----------------------------------- 0.68s 2026-04-20 00:45:51.165989 | orchestrator | Add known partitions to the list of available block devices ------------- 0.67s 2026-04-20 00:45:51.165996 | orchestrator | Get extra vars for Ceph configuration ----------------------------------- 0.66s 2026-04-20 00:45:51.166003 | orchestrator | Combine JSON from _db/wal/db_wal_vgs_cmd_output ------------------------- 0.65s 2026-04-20 00:45:51.166045 | orchestrator | Create dict of block VGs -> PVs from ceph_osd_devices ------------------- 0.65s 2026-04-20 00:45:51.166053 | orchestrator | Get initial list of available block devices ----------------------------- 0.63s 2026-04-20 00:45:51.166060 | orchestrator | Print 'Create DB LVs for ceph_db_devices' ------------------------------- 0.62s 2026-04-20 00:46:02.776658 | orchestrator | 2026-04-20 00:46:02 | INFO  | Prepare task for execution of facts. 2026-04-20 00:46:02.843351 | orchestrator | 2026-04-20 00:46:02 | INFO  | Task 825ebf53-eb8d-4e04-adf3-51d218336879 (facts) was prepared for execution. 2026-04-20 00:46:02.843429 | orchestrator | 2026-04-20 00:46:02 | INFO  | It takes a moment until task 825ebf53-eb8d-4e04-adf3-51d218336879 (facts) has been started and output is visible here. 
2026-04-20 00:46:12.975317 | orchestrator | 2026-04-20 00:46:12.975418 | orchestrator | PLAY [Apply role facts] ******************************************************** 2026-04-20 00:46:12.975437 | orchestrator | 2026-04-20 00:46:12.975448 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] ********************* 2026-04-20 00:46:12.975496 | orchestrator | Monday 20 April 2026 00:46:05 +0000 (0:00:00.299) 0:00:00.299 ********** 2026-04-20 00:46:12.975528 | orchestrator | ok: [testbed-manager] 2026-04-20 00:46:12.975538 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:46:12.975547 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:46:12.975556 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:46:12.975565 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:46:12.975574 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:46:12.975588 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:46:12.975602 | orchestrator | 2026-04-20 00:46:12.975617 | orchestrator | TASK [osism.commons.facts : Copy fact files] *********************************** 2026-04-20 00:46:12.975632 | orchestrator | Monday 20 April 2026 00:46:07 +0000 (0:00:01.163) 0:00:01.463 ********** 2026-04-20 00:46:12.975648 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:46:12.975664 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:46:12.975680 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:46:12.975690 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:46:12.975699 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:46:12.975707 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:46:12.975715 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:46:12.975724 | orchestrator | 2026-04-20 00:46:12.975733 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2026-04-20 00:46:12.975741 | orchestrator | 2026-04-20 00:46:12.975750 | orchestrator | TASK [Gathers facts about hosts] 
*********************************************** 2026-04-20 00:46:12.975758 | orchestrator | Monday 20 April 2026 00:46:07 +0000 (0:00:00.928) 0:00:02.392 ********** 2026-04-20 00:46:12.975767 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:46:12.975775 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:46:12.975784 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:46:12.975793 | orchestrator | ok: [testbed-manager] 2026-04-20 00:46:12.975802 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:46:12.975817 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:46:12.975832 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:46:12.975848 | orchestrator | 2026-04-20 00:46:12.975863 | orchestrator | PLAY [Gather facts for all hosts if using --limit] ***************************** 2026-04-20 00:46:12.975878 | orchestrator | 2026-04-20 00:46:12.975890 | orchestrator | TASK [Gather facts for all hosts] ********************************************** 2026-04-20 00:46:12.975900 | orchestrator | Monday 20 April 2026 00:46:12 +0000 (0:00:04.258) 0:00:06.651 ********** 2026-04-20 00:46:12.975910 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:46:12.975919 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:46:12.975930 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:46:12.975964 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:46:12.975980 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:46:12.975996 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:46:12.976012 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:46:12.976027 | orchestrator | 2026-04-20 00:46:12.976043 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:46:12.976059 | orchestrator | testbed-manager : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-20 00:46:12.976076 | orchestrator | testbed-node-0 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 
ignored=0 2026-04-20 00:46:12.976092 | orchestrator | testbed-node-1 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-20 00:46:12.976107 | orchestrator | testbed-node-2 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-20 00:46:12.976123 | orchestrator | testbed-node-3 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-20 00:46:12.976132 | orchestrator | testbed-node-4 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-20 00:46:12.976141 | orchestrator | testbed-node-5 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-20 00:46:12.976150 | orchestrator | 2026-04-20 00:46:12.976158 | orchestrator | 2026-04-20 00:46:12.976167 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 00:46:12.976176 | orchestrator | Monday 20 April 2026 00:46:12 +0000 (0:00:00.511) 0:00:07.162 ********** 2026-04-20 00:46:12.976185 | orchestrator | =============================================================================== 2026-04-20 00:46:12.976193 | orchestrator | Gathers facts about hosts ----------------------------------------------- 4.26s 2026-04-20 00:46:12.976201 | orchestrator | osism.commons.facts : Create custom facts directory --------------------- 1.16s 2026-04-20 00:46:12.976210 | orchestrator | osism.commons.facts : Copy fact files ----------------------------------- 0.93s 2026-04-20 00:46:12.976219 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.51s 2026-04-20 00:46:24.364193 | orchestrator | 2026-04-20 00:46:24 | INFO  | Prepare task for execution of frr. 2026-04-20 00:46:24.443070 | orchestrator | 2026-04-20 00:46:24 | INFO  | Task 8d1325a3-2b0e-401f-b9bf-ab4ff3361968 (frr) was prepared for execution. 
2026-04-20 00:46:24.443171 | orchestrator | 2026-04-20 00:46:24 | INFO  | It takes a moment until task 8d1325a3-2b0e-401f-b9bf-ab4ff3361968 (frr) has been started and output is visible here. 2026-04-20 00:46:46.766577 | orchestrator | 2026-04-20 00:46:46.766688 | orchestrator | PLAY [Apply role frr] ********************************************************** 2026-04-20 00:46:46.766700 | orchestrator | 2026-04-20 00:46:46.766707 | orchestrator | TASK [osism.services.frr : Include distribution specific install tasks] ******** 2026-04-20 00:46:46.766714 | orchestrator | Monday 20 April 2026 00:46:27 +0000 (0:00:00.333) 0:00:00.333 ********** 2026-04-20 00:46:46.766721 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/frr/tasks/install-Debian-family.yml for testbed-manager 2026-04-20 00:46:46.766730 | orchestrator | 2026-04-20 00:46:46.766736 | orchestrator | TASK [osism.services.frr : Pin frr package version] **************************** 2026-04-20 00:46:46.766740 | orchestrator | Monday 20 April 2026 00:46:27 +0000 (0:00:00.212) 0:00:00.546 ********** 2026-04-20 00:46:46.766745 | orchestrator | changed: [testbed-manager] 2026-04-20 00:46:46.766749 | orchestrator | 2026-04-20 00:46:46.766753 | orchestrator | TASK [osism.services.frr : Install frr package] ******************************** 2026-04-20 00:46:46.766758 | orchestrator | Monday 20 April 2026 00:46:29 +0000 (0:00:01.493) 0:00:02.039 ********** 2026-04-20 00:46:46.766785 | orchestrator | changed: [testbed-manager] 2026-04-20 00:46:46.766789 | orchestrator | 2026-04-20 00:46:46.766793 | orchestrator | TASK [osism.services.frr : Copy file: /etc/frr/vtysh.conf] ********************* 2026-04-20 00:46:46.766797 | orchestrator | Monday 20 April 2026 00:46:37 +0000 (0:00:08.201) 0:00:10.240 ********** 2026-04-20 00:46:46.766801 | orchestrator | ok: [testbed-manager] 2026-04-20 00:46:46.766806 | orchestrator | 2026-04-20 00:46:46.766810 | orchestrator | TASK 
[osism.services.frr : Copy file: /etc/frr/daemons] ************************ 2026-04-20 00:46:46.766815 | orchestrator | Monday 20 April 2026 00:46:38 +0000 (0:00:00.915) 0:00:11.156 ********** 2026-04-20 00:46:46.766819 | orchestrator | changed: [testbed-manager] 2026-04-20 00:46:46.766823 | orchestrator | 2026-04-20 00:46:46.766827 | orchestrator | TASK [osism.services.frr : Set _frr_uplinks fact] ****************************** 2026-04-20 00:46:46.766831 | orchestrator | Monday 20 April 2026 00:46:39 +0000 (0:00:00.829) 0:00:11.985 ********** 2026-04-20 00:46:46.766834 | orchestrator | ok: [testbed-manager] 2026-04-20 00:46:46.766838 | orchestrator | 2026-04-20 00:46:46.766842 | orchestrator | TASK [osism.services.frr : Write frr_config_template to temporary file] ******** 2026-04-20 00:46:46.766846 | orchestrator | Monday 20 April 2026 00:46:40 +0000 (0:00:01.150) 0:00:13.136 ********** 2026-04-20 00:46:46.766849 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:46:46.766853 | orchestrator | 2026-04-20 00:46:46.766857 | orchestrator | TASK [osism.services.frr : Render frr.conf from frr_config_template variable] *** 2026-04-20 00:46:46.766861 | orchestrator | Monday 20 April 2026 00:46:40 +0000 (0:00:00.177) 0:00:13.313 ********** 2026-04-20 00:46:46.766864 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:46:46.766868 | orchestrator | 2026-04-20 00:46:46.766872 | orchestrator | TASK [osism.services.frr : Remove temporary frr_config_template file] ********** 2026-04-20 00:46:46.766875 | orchestrator | Monday 20 April 2026 00:46:40 +0000 (0:00:00.244) 0:00:13.557 ********** 2026-04-20 00:46:46.766879 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:46:46.766883 | orchestrator | 2026-04-20 00:46:46.766901 | orchestrator | TASK [osism.services.frr : Check for frr.conf file in the configuration repository] *** 2026-04-20 00:46:46.766905 | orchestrator | Monday 20 April 2026 00:46:41 +0000 (0:00:00.146) 0:00:13.704 ********** 2026-04-20 
00:46:46.766909 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:46:46.766913 | orchestrator | 2026-04-20 00:46:46.766917 | orchestrator | TASK [osism.services.frr : Copy frr.conf file from the configuration repository] *** 2026-04-20 00:46:46.766921 | orchestrator | Monday 20 April 2026 00:46:41 +0000 (0:00:00.117) 0:00:13.822 ********** 2026-04-20 00:46:46.766925 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:46:46.766928 | orchestrator | 2026-04-20 00:46:46.766932 | orchestrator | TASK [osism.services.frr : Copy default frr.conf file of type k3s_cilium] ****** 2026-04-20 00:46:46.766936 | orchestrator | Monday 20 April 2026 00:46:41 +0000 (0:00:00.145) 0:00:13.968 ********** 2026-04-20 00:46:46.766940 | orchestrator | changed: [testbed-manager] 2026-04-20 00:46:46.766944 | orchestrator | 2026-04-20 00:46:46.766947 | orchestrator | TASK [osism.services.frr : Set sysctl parameters] ****************************** 2026-04-20 00:46:46.766951 | orchestrator | Monday 20 April 2026 00:46:42 +0000 (0:00:00.888) 0:00:14.856 ********** 2026-04-20 00:46:46.766955 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.ip_forward', 'value': 1}) 2026-04-20 00:46:46.766959 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.all.send_redirects', 'value': 0}) 2026-04-20 00:46:46.766964 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.all.accept_redirects', 'value': 0}) 2026-04-20 00:46:46.766968 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.fib_multipath_hash_policy', 'value': 1}) 2026-04-20 00:46:46.766971 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.default.ignore_routes_with_linkdown', 'value': 1}) 2026-04-20 00:46:46.766975 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.all.rp_filter', 'value': 2}) 2026-04-20 00:46:46.766979 | orchestrator | 2026-04-20 00:46:46.766983 | orchestrator | TASK 
[osism.services.frr : Manage frr service] ********************************* 2026-04-20 00:46:46.766995 | orchestrator | Monday 20 April 2026 00:46:44 +0000 (0:00:01.951) 0:00:16.808 ********** 2026-04-20 00:46:46.767002 | orchestrator | ok: [testbed-manager] 2026-04-20 00:46:46.767005 | orchestrator | 2026-04-20 00:46:46.767009 | orchestrator | RUNNING HANDLER [osism.services.frr : Restart frr service] ********************* 2026-04-20 00:46:46.767013 | orchestrator | Monday 20 April 2026 00:46:45 +0000 (0:00:01.045) 0:00:17.854 ********** 2026-04-20 00:46:46.767017 | orchestrator | changed: [testbed-manager] 2026-04-20 00:46:46.767021 | orchestrator | 2026-04-20 00:46:46.767024 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:46:46.767029 | orchestrator | testbed-manager : ok=10  changed=6  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2026-04-20 00:46:46.767033 | orchestrator | 2026-04-20 00:46:46.767037 | orchestrator | 2026-04-20 00:46:46.767053 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 00:46:46.767057 | orchestrator | Monday 20 April 2026 00:46:46 +0000 (0:00:01.278) 0:00:19.133 ********** 2026-04-20 00:46:46.767062 | orchestrator | =============================================================================== 2026-04-20 00:46:46.767066 | orchestrator | osism.services.frr : Install frr package -------------------------------- 8.20s 2026-04-20 00:46:46.767070 | orchestrator | osism.services.frr : Set sysctl parameters ------------------------------ 1.95s 2026-04-20 00:46:46.767075 | orchestrator | osism.services.frr : Pin frr package version ---------------------------- 1.49s 2026-04-20 00:46:46.767079 | orchestrator | osism.services.frr : Restart frr service -------------------------------- 1.28s 2026-04-20 00:46:46.767083 | orchestrator | osism.services.frr : Set _frr_uplinks fact ------------------------------ 1.15s 
2026-04-20 00:46:46.767088 | orchestrator | osism.services.frr : Manage frr service --------------------------------- 1.05s
2026-04-20 00:46:46.767092 | orchestrator | osism.services.frr : Copy file: /etc/frr/vtysh.conf --------------------- 0.92s
2026-04-20 00:46:46.767096 | orchestrator | osism.services.frr : Copy default frr.conf file of type k3s_cilium ------ 0.89s
2026-04-20 00:46:46.767100 | orchestrator | osism.services.frr : Copy file: /etc/frr/daemons ------------------------ 0.83s
2026-04-20 00:46:46.767105 | orchestrator | osism.services.frr : Render frr.conf from frr_config_template variable --- 0.24s
2026-04-20 00:46:46.767109 | orchestrator | osism.services.frr : Include distribution specific install tasks -------- 0.21s
2026-04-20 00:46:46.767115 | orchestrator | osism.services.frr : Write frr_config_template to temporary file -------- 0.18s
2026-04-20 00:46:46.767121 | orchestrator | osism.services.frr : Remove temporary frr_config_template file ---------- 0.15s
2026-04-20 00:46:46.767129 | orchestrator | osism.services.frr : Copy frr.conf file from the configuration repository --- 0.15s
2026-04-20 00:46:46.767138 | orchestrator | osism.services.frr : Check for frr.conf file in the configuration repository --- 0.12s
2026-04-20 00:46:46.900332 | orchestrator |
2026-04-20 00:46:46.901790 | orchestrator | --> DEPLOY IN A NUTSHELL -- START -- Mon Apr 20 00:46:46 UTC 2026
2026-04-20 00:46:46.901849 | orchestrator |
2026-04-20 00:46:47.916760 | orchestrator | 2026-04-20 00:46:47 | INFO  | Collection nutshell is prepared for execution
2026-04-20 00:46:48.017693 | orchestrator | 2026-04-20 00:46:48 | INFO  | A [0] - dotfiles
2026-04-20 00:46:58.046987 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [0] - homer
2026-04-20 00:46:58.047582 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [0] - netdata
2026-04-20 00:46:58.047601 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [0] - openstackclient
2026-04-20 00:46:58.047607 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [0] - phpmyadmin
2026-04-20 00:46:58.047612 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [0] - common
2026-04-20 00:46:58.051063 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [1] -- loadbalancer
2026-04-20 00:46:58.051235 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [2] --- opensearch
2026-04-20 00:46:58.051250 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [2] --- mariadb-ng
2026-04-20 00:46:58.051323 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [3] ---- horizon
2026-04-20 00:46:58.051755 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [3] ---- keystone
2026-04-20 00:46:58.051791 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [4] ----- neutron
2026-04-20 00:46:58.051798 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [5] ------ wait-for-nova
2026-04-20 00:46:58.051806 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [6] ------- octavia
2026-04-20 00:46:58.053291 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [4] ----- barbican
2026-04-20 00:46:58.053361 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [4] ----- designate
2026-04-20 00:46:58.053414 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [4] ----- ironic
2026-04-20 00:46:58.053619 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [4] ----- placement
2026-04-20 00:46:58.053628 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [4] ----- magnum
2026-04-20 00:46:58.054219 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [1] -- openvswitch
2026-04-20 00:46:58.054244 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [2] --- ovn
2026-04-20 00:46:58.054635 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [1] -- memcached
2026-04-20 00:46:58.054657 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [1] -- redis
2026-04-20 00:46:58.054741 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [1] -- rabbitmq-ng
2026-04-20 00:46:58.055123 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [0] - kubernetes
2026-04-20 00:46:58.057421 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [1] -- kubeconfig
2026-04-20 00:46:58.057471 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [1] -- copy-kubeconfig
2026-04-20 00:46:58.057517 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [0] - ceph
2026-04-20 00:46:58.059208 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [1] -- ceph-pools
2026-04-20 00:46:58.059265 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [2] --- copy-ceph-keys
2026-04-20 00:46:58.059472 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [3] ---- cephclient
2026-04-20 00:46:58.059537 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [4] ----- ceph-bootstrap-dashboard
2026-04-20 00:46:58.059798 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [4] ----- wait-for-keystone
2026-04-20 00:46:58.059908 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [5] ------ kolla-ceph-rgw
2026-04-20 00:46:58.059921 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [5] ------ glance
2026-04-20 00:46:58.059929 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [5] ------ cinder
2026-04-20 00:46:58.059935 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [5] ------ nova
2026-04-20 00:46:58.060026 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [4] ----- prometheus
2026-04-20 00:46:58.060401 | orchestrator | 2026-04-20 00:46:58 | INFO  | A [5] ------ grafana
2026-04-20 00:46:58.264383 | orchestrator | 2026-04-20 00:46:58 | INFO  | All tasks of the collection nutshell are prepared for execution
2026-04-20 00:46:58.264459 | orchestrator | 2026-04-20 00:46:58 | INFO  | Tasks are running in the background
2026-04-20 00:46:59.798540 | orchestrator | 2026-04-20 00:46:59 | INFO  | No task IDs specified, wait for all currently running tasks
2026-04-20 00:47:02.025031 | orchestrator | 2026-04-20 00:47:02 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED
2026-04-20 00:47:02.025962 | orchestrator | 2026-04-20 00:47:02 | INFO  | Task c9708ee8-9534-4d56-8026-19ed252e6341 is in state STARTED
2026-04-20 00:47:02.026608 | orchestrator | 2026-04-20 00:47:02 | INFO
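The "A [n]" prefixes in the listing above mark each service's depth in the nutshell dependency DAG: a service at level n waits for its level n-1 dependencies. A minimal sketch of how such levels can be derived, using a hypothetical, partial dependency map (not the real OSISM collection data):

```python
from functools import lru_cache

# Illustrative subset of the dependency graph implied by the "A [n]"
# markers above; the real nutshell collection defines far more services.
DEPENDS_ON = {
    "common": [],
    "loadbalancer": ["common"],
    "opensearch": ["loadbalancer"],
    "mariadb-ng": ["loadbalancer"],
    "keystone": ["mariadb-ng"],
    "neutron": ["keystone"],
    "wait-for-nova": ["neutron"],
    "octavia": ["wait-for-nova"],
}

@lru_cache(maxsize=None)
def level(service: str) -> int:
    """Depth in the DAG: 0 for roots, else 1 + deepest dependency."""
    deps = DEPENDS_ON[service]
    return 0 if not deps else 1 + max(level(d) for d in deps)

if __name__ == "__main__":
    # Reproduce the log's "A [n] -… name" layout for this subset.
    for svc in DEPENDS_ON:
        print(f"A [{level(svc)}] {'-' * (level(svc) + 1)} {svc}")
```

With this subset, octavia lands at level 6 via common → loadbalancer → mariadb-ng → keystone → neutron → wait-for-nova, matching its "A [6]" marker in the log.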
 | Task c2ceed3b-ad23-412e-8c7c-104af11700bb is in state STARTED 2026-04-20 00:47:02.027698 | orchestrator | 2026-04-20 00:47:02 | INFO  | Task c05c8a7e-60b2-471d-a4a1-d7d2ed70c724 is in state STARTED 2026-04-20 00:47:02.031206 | orchestrator | 2026-04-20 00:47:02 | INFO  | Task 92921e92-e531-4698-91c2-36347756a782 is in state STARTED 2026-04-20 00:47:02.031976 | orchestrator | 2026-04-20 00:47:02 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:47:02.032979 | orchestrator | 2026-04-20 00:47:02 | INFO  | Task 1d04d84e-203a-4d0d-8276-972517871f3e is in state STARTED 2026-04-20 00:47:02.032998 | orchestrator | 2026-04-20 00:47:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:47:05.075636 | orchestrator | 2026-04-20 00:47:05 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:47:05.079181 | orchestrator | 2026-04-20 00:47:05 | INFO  | Task c9708ee8-9534-4d56-8026-19ed252e6341 is in state STARTED 2026-04-20 00:47:05.079237 | orchestrator | 2026-04-20 00:47:05 | INFO  | Task c2ceed3b-ad23-412e-8c7c-104af11700bb is in state STARTED 2026-04-20 00:47:05.079245 | orchestrator | 2026-04-20 00:47:05 | INFO  | Task c05c8a7e-60b2-471d-a4a1-d7d2ed70c724 is in state STARTED 2026-04-20 00:47:05.082106 | orchestrator | 2026-04-20 00:47:05 | INFO  | Task 92921e92-e531-4698-91c2-36347756a782 is in state STARTED 2026-04-20 00:47:05.084080 | orchestrator | 2026-04-20 00:47:05 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:47:05.084138 | orchestrator | 2026-04-20 00:47:05 | INFO  | Task 1d04d84e-203a-4d0d-8276-972517871f3e is in state STARTED 2026-04-20 00:47:05.084144 | orchestrator | 2026-04-20 00:47:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:47:08.137147 | orchestrator | 2026-04-20 00:47:08 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:47:08.137581 | orchestrator | 2026-04-20 00:47:08 | INFO  | Task 
c9708ee8-9534-4d56-8026-19ed252e6341 is in state STARTED 2026-04-20 00:47:08.138205 | orchestrator | 2026-04-20 00:47:08 | INFO  | Task c2ceed3b-ad23-412e-8c7c-104af11700bb is in state STARTED 2026-04-20 00:47:08.138885 | orchestrator | 2026-04-20 00:47:08 | INFO  | Task c05c8a7e-60b2-471d-a4a1-d7d2ed70c724 is in state STARTED 2026-04-20 00:47:08.139461 | orchestrator | 2026-04-20 00:47:08 | INFO  | Task 92921e92-e531-4698-91c2-36347756a782 is in state STARTED 2026-04-20 00:47:08.140384 | orchestrator | 2026-04-20 00:47:08 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:47:08.141099 | orchestrator | 2026-04-20 00:47:08 | INFO  | Task 1d04d84e-203a-4d0d-8276-972517871f3e is in state STARTED 2026-04-20 00:47:08.141153 | orchestrator | 2026-04-20 00:47:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:47:11.217475 | orchestrator | 2026-04-20 00:47:11 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:47:11.217621 | orchestrator | 2026-04-20 00:47:11 | INFO  | Task c9708ee8-9534-4d56-8026-19ed252e6341 is in state STARTED 2026-04-20 00:47:11.217634 | orchestrator | 2026-04-20 00:47:11 | INFO  | Task c2ceed3b-ad23-412e-8c7c-104af11700bb is in state STARTED 2026-04-20 00:47:11.217640 | orchestrator | 2026-04-20 00:47:11 | INFO  | Task c05c8a7e-60b2-471d-a4a1-d7d2ed70c724 is in state STARTED 2026-04-20 00:47:11.217648 | orchestrator | 2026-04-20 00:47:11 | INFO  | Task 92921e92-e531-4698-91c2-36347756a782 is in state STARTED 2026-04-20 00:47:11.217654 | orchestrator | 2026-04-20 00:47:11 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:47:11.217662 | orchestrator | 2026-04-20 00:47:11 | INFO  | Task 1d04d84e-203a-4d0d-8276-972517871f3e is in state STARTED 2026-04-20 00:47:11.217699 | orchestrator | 2026-04-20 00:47:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:47:14.291420 | orchestrator | 2026-04-20 00:47:14 | INFO  | Task 
ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:47:14.293457 | orchestrator | 2026-04-20 00:47:14 | INFO  | Task c9708ee8-9534-4d56-8026-19ed252e6341 is in state STARTED 2026-04-20 00:47:14.293624 | orchestrator | 2026-04-20 00:47:14 | INFO  | Task c2ceed3b-ad23-412e-8c7c-104af11700bb is in state STARTED 2026-04-20 00:47:14.319150 | orchestrator | 2026-04-20 00:47:14 | INFO  | Task c05c8a7e-60b2-471d-a4a1-d7d2ed70c724 is in state STARTED 2026-04-20 00:47:14.319219 | orchestrator | 2026-04-20 00:47:14 | INFO  | Task 92921e92-e531-4698-91c2-36347756a782 is in state STARTED 2026-04-20 00:47:14.319228 | orchestrator | 2026-04-20 00:47:14 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:47:14.319234 | orchestrator | 2026-04-20 00:47:14 | INFO  | Task 1d04d84e-203a-4d0d-8276-972517871f3e is in state STARTED 2026-04-20 00:47:14.319243 | orchestrator | 2026-04-20 00:47:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:47:18.092123 | orchestrator | 2026-04-20 00:47:17 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:47:18.092185 | orchestrator | 2026-04-20 00:47:17 | INFO  | Task c9708ee8-9534-4d56-8026-19ed252e6341 is in state STARTED 2026-04-20 00:47:18.092196 | orchestrator | 2026-04-20 00:47:17 | INFO  | Task c2ceed3b-ad23-412e-8c7c-104af11700bb is in state STARTED 2026-04-20 00:47:18.092204 | orchestrator | 2026-04-20 00:47:17 | INFO  | Task c05c8a7e-60b2-471d-a4a1-d7d2ed70c724 is in state STARTED 2026-04-20 00:47:18.092214 | orchestrator | 2026-04-20 00:47:17 | INFO  | Task 92921e92-e531-4698-91c2-36347756a782 is in state STARTED 2026-04-20 00:47:18.092221 | orchestrator | 2026-04-20 00:47:17 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:47:18.092229 | orchestrator | 2026-04-20 00:47:17 | INFO  | Task 1d04d84e-203a-4d0d-8276-972517871f3e is in state STARTED 2026-04-20 00:47:18.092236 | orchestrator | 2026-04-20 
00:47:17 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:47:21.245383 | orchestrator | 2026-04-20 00:47:21 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED
2026-04-20 00:47:21.245489 | orchestrator | 2026-04-20 00:47:21 | INFO  | Task c9708ee8-9534-4d56-8026-19ed252e6341 is in state STARTED
2026-04-20 00:47:21.245543 | orchestrator | 2026-04-20 00:47:21 | INFO  | Task c2ceed3b-ad23-412e-8c7c-104af11700bb is in state STARTED
2026-04-20 00:47:21.247103 | orchestrator | 2026-04-20 00:47:21 | INFO  | Task c05c8a7e-60b2-471d-a4a1-d7d2ed70c724 is in state STARTED
2026-04-20 00:47:21.247284 | orchestrator | 2026-04-20 00:47:21 | INFO  | Task 92921e92-e531-4698-91c2-36347756a782 is in state STARTED
2026-04-20 00:47:21.248578 | orchestrator | 2026-04-20 00:47:21 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:47:21.249623 | orchestrator | 2026-04-20 00:47:21 | INFO  | Task 1d04d84e-203a-4d0d-8276-972517871f3e is in state STARTED
2026-04-20 00:47:21.249668 | orchestrator | 2026-04-20 00:47:21 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:47:24.408601 | orchestrator |
2026-04-20 00:47:24.408653 | orchestrator | PLAY [Apply role geerlingguy.dotfiles] *****************************************
2026-04-20 00:47:24.408664 | orchestrator |
2026-04-20 00:47:24.408672 | orchestrator | TASK [geerlingguy.dotfiles : Ensure dotfiles repository is cloned locally.] ****
2026-04-20 00:47:24.408683 | orchestrator | Monday 20 April 2026 00:47:08 +0000 (0:00:00.853) 0:00:00.853 **********
2026-04-20 00:47:24.408705 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:47:24.408712 | orchestrator | changed: [testbed-manager]
2026-04-20 00:47:24.408718 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:47:24.408724 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:47:24.408731 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:47:24.408735 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:47:24.408739 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:47:24.408742 | orchestrator |
2026-04-20 00:47:24.408746 | orchestrator | TASK [geerlingguy.dotfiles : Ensure all configured dotfiles are links.] ********
2026-04-20 00:47:24.408750 | orchestrator | Monday 20 April 2026 00:47:13 +0000 (0:00:02.657) 0:00:05.853 **********
2026-04-20 00:47:24.408754 | orchestrator | ok: [testbed-manager] => (item=.tmux.conf)
2026-04-20 00:47:24.408758 | orchestrator | ok: [testbed-node-1] => (item=.tmux.conf)
2026-04-20 00:47:24.408762 | orchestrator | ok: [testbed-node-2] => (item=.tmux.conf)
2026-04-20 00:47:24.408766 | orchestrator | ok: [testbed-node-0] => (item=.tmux.conf)
2026-04-20 00:47:24.408769 | orchestrator | ok: [testbed-node-4] => (item=.tmux.conf)
2026-04-20 00:47:24.408773 | orchestrator | ok: [testbed-node-5] => (item=.tmux.conf)
2026-04-20 00:47:24.408777 | orchestrator | ok: [testbed-node-3] => (item=.tmux.conf)
2026-04-20 00:47:24.408781 | orchestrator |
2026-04-20 00:47:24.408784 | orchestrator | TASK [geerlingguy.dotfiles : Remove existing dotfiles file if a replacement is being linked.]
*** 2026-04-20 00:47:24.408788 | orchestrator | Monday 20 April 2026 00:47:16 +0000 (0:00:02.657) 0:00:08.511 ********** 2026-04-20 00:47:24.408795 | orchestrator | ok: [testbed-node-0] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-04-20 00:47:14.544443', 'end': '2026-04-20 00:47:14.551846', 'delta': '0:00:00.007403', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-04-20 00:47:24.408803 | orchestrator | ok: [testbed-node-1] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-04-20 00:47:14.426064', 'end': '2026-04-20 00:47:14.433321', 'delta': '0:00:00.007257', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-04-20 00:47:24.408807 | orchestrator | ok: [testbed-node-2] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access 
'/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-04-20 00:47:14.479324', 'end': '2026-04-20 00:47:14.486694', 'delta': '0:00:00.007370', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-04-20 00:47:24.408830 | orchestrator | ok: [testbed-node-3] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-04-20 00:47:14.761271', 'end': '2026-04-20 00:47:15.770101', 'delta': '0:00:01.008830', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-04-20 00:47:24.408834 | orchestrator | ok: [testbed-node-4] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-04-20 00:47:15.284178', 'end': '2026-04-20 00:47:15.290443', 'delta': '0:00:00.006265', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': 
{'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-04-20 00:47:24.408838 | orchestrator | ok: [testbed-node-5] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-04-20 00:47:15.370229', 'end': '2026-04-20 00:47:15.378271', 'delta': '0:00:00.008042', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2026-04-20 00:47:24.408967 | orchestrator | ok: [testbed-manager] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2026-04-20 00:47:14.344106', 'end': '2026-04-20 00:47:14.350925', 'delta': '0:00:00.006819', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': 
["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}])
2026-04-20 00:47:24.408972 | orchestrator |
2026-04-20 00:47:24.408976 | orchestrator | TASK [geerlingguy.dotfiles : Ensure parent folders of link dotfiles exist.] ****
2026-04-20 00:47:24.408980 | orchestrator | Monday 20 April 2026 00:47:18 +0000 (0:00:02.105) 0:00:10.616 **********
2026-04-20 00:47:24.408984 | orchestrator | ok: [testbed-node-0] => (item=.tmux.conf)
2026-04-20 00:47:24.408988 | orchestrator | ok: [testbed-node-1] => (item=.tmux.conf)
2026-04-20 00:47:24.408991 | orchestrator | ok: [testbed-manager] => (item=.tmux.conf)
2026-04-20 00:47:24.408995 | orchestrator | ok: [testbed-node-2] => (item=.tmux.conf)
2026-04-20 00:47:24.408999 | orchestrator | ok: [testbed-node-3] => (item=.tmux.conf)
2026-04-20 00:47:24.409006 | orchestrator | ok: [testbed-node-4] => (item=.tmux.conf)
2026-04-20 00:47:24.409010 | orchestrator | ok: [testbed-node-5] => (item=.tmux.conf)
2026-04-20 00:47:24.409014 | orchestrator |
2026-04-20 00:47:24.409017 | orchestrator | TASK [geerlingguy.dotfiles : Link dotfiles into home folder.] ******************
2026-04-20 00:47:24.409021 | orchestrator | Monday 20 April 2026 00:47:19 +0000 (0:00:01.166) 0:00:11.783 **********
2026-04-20 00:47:24.409025 | orchestrator | changed: [testbed-node-0] => (item=.tmux.conf)
2026-04-20 00:47:24.409029 | orchestrator | changed: [testbed-manager] => (item=.tmux.conf)
2026-04-20 00:47:24.409032 | orchestrator | changed: [testbed-node-1] => (item=.tmux.conf)
2026-04-20 00:47:24.409036 | orchestrator | changed: [testbed-node-2] => (item=.tmux.conf)
2026-04-20 00:47:24.409040 | orchestrator | changed: [testbed-node-4] => (item=.tmux.conf)
2026-04-20 00:47:24.409044 | orchestrator | changed: [testbed-node-3] => (item=.tmux.conf)
2026-04-20 00:47:24.409048 | orchestrator | changed: [testbed-node-5] => (item=.tmux.conf)
2026-04-20 00:47:24.409051 | orchestrator |
2026-04-20 00:47:24.409055 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:47:24.409063 | orchestrator | testbed-manager : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:47:24.409067 | orchestrator | testbed-node-0 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:47:24.409071 | orchestrator | testbed-node-1 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:47:24.409075 | orchestrator | testbed-node-2 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:47:24.409079 | orchestrator | testbed-node-3 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:47:24.409082 | orchestrator | testbed-node-4 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:47:24.409086 | orchestrator | testbed-node-5 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:47:24.409090 | orchestrator |
2026-04-20 00:47:24.409094 | orchestrator |
2026-04-20 00:47:24.409097 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:47:24.409101 | orchestrator | Monday 20 April 2026 00:47:22 +0000 (0:00:02.903) 0:00:14.686 **********
2026-04-20 00:47:24.409105 | orchestrator | ===============================================================================
2026-04-20 00:47:24.409109 | orchestrator | geerlingguy.dotfiles : Ensure dotfiles repository is cloned locally. ---- 5.00s
2026-04-20 00:47:24.409113 | orchestrator | geerlingguy.dotfiles : Link dotfiles into home folder. ------------------ 2.90s
2026-04-20 00:47:24.409116 | orchestrator | geerlingguy.dotfiles : Ensure all configured dotfiles are links. -------- 2.66s
2026-04-20 00:47:24.409120 | orchestrator | geerlingguy.dotfiles : Remove existing dotfiles file if a replacement is being linked. --- 2.10s
2026-04-20 00:47:24.409124 | orchestrator | geerlingguy.dotfiles : Ensure parent folders of link dotfiles exist. ---- 1.17s
2026-04-20 00:47:24.409128 | orchestrator | 2026-04-20 00:47:24 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED
2026-04-20 00:47:24.409131 | orchestrator | 2026-04-20 00:47:24 | INFO  | Task c9708ee8-9534-4d56-8026-19ed252e6341 is in state STARTED
2026-04-20 00:47:24.409137 | orchestrator | 2026-04-20 00:47:24 | INFO  | Task c2ceed3b-ad23-412e-8c7c-104af11700bb is in state SUCCESS
2026-04-20 00:47:24.409141 | orchestrator | 2026-04-20 00:47:24 | INFO  | Task c05c8a7e-60b2-471d-a4a1-d7d2ed70c724 is in state STARTED
2026-04-20 00:47:24.409145 | orchestrator | 2026-04-20 00:47:24 | INFO  | Task a10ca595-c026-4255-baf9-f76e89c2d81f is in state STARTED
2026-04-20 00:47:24.409151 | orchestrator | 2026-04-20 00:47:24 | INFO  | Task 92921e92-e531-4698-91c2-36347756a782 is in state STARTED
2026-04-20 00:47:24.409155 | orchestrator | 2026-04-20 00:47:24 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:47:24.428058 | orchestrator | 2026-04-20 00:47:24 | INFO  | Task
1d04d84e-203a-4d0d-8276-972517871f3e is in state STARTED 2026-04-20 00:47:24.428106 | orchestrator | 2026-04-20 00:47:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:47:27.481857 | orchestrator | 2026-04-20 00:47:27 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:47:27.481932 | orchestrator | 2026-04-20 00:47:27 | INFO  | Task c9708ee8-9534-4d56-8026-19ed252e6341 is in state STARTED 2026-04-20 00:47:27.484165 | orchestrator | 2026-04-20 00:47:27 | INFO  | Task c05c8a7e-60b2-471d-a4a1-d7d2ed70c724 is in state STARTED 2026-04-20 00:47:27.486476 | orchestrator | 2026-04-20 00:47:27 | INFO  | Task a10ca595-c026-4255-baf9-f76e89c2d81f is in state STARTED 2026-04-20 00:47:27.493067 | orchestrator | 2026-04-20 00:47:27 | INFO  | Task 92921e92-e531-4698-91c2-36347756a782 is in state STARTED 2026-04-20 00:47:27.493133 | orchestrator | 2026-04-20 00:47:27 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:47:27.493145 | orchestrator | 2026-04-20 00:47:27 | INFO  | Task 1d04d84e-203a-4d0d-8276-972517871f3e is in state STARTED 2026-04-20 00:47:27.494325 | orchestrator | 2026-04-20 00:47:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:47:30.620448 | orchestrator | 2026-04-20 00:47:30 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:47:30.625649 | orchestrator | 2026-04-20 00:47:30 | INFO  | Task c9708ee8-9534-4d56-8026-19ed252e6341 is in state STARTED 2026-04-20 00:47:30.627252 | orchestrator | 2026-04-20 00:47:30 | INFO  | Task c05c8a7e-60b2-471d-a4a1-d7d2ed70c724 is in state STARTED 2026-04-20 00:47:30.627818 | orchestrator | 2026-04-20 00:47:30 | INFO  | Task a10ca595-c026-4255-baf9-f76e89c2d81f is in state STARTED 2026-04-20 00:47:30.632345 | orchestrator | 2026-04-20 00:47:30 | INFO  | Task 92921e92-e531-4698-91c2-36347756a782 is in state STARTED 2026-04-20 00:47:30.632791 | orchestrator | 2026-04-20 00:47:30 | INFO  | Task 
c05c8a7e-60b2-471d-a4a1-d7d2ed70c724 is in state STARTED 2026-04-20 00:47:49.231228 | orchestrator | 2026-04-20 00:47:49 | INFO  | Task a10ca595-c026-4255-baf9-f76e89c2d81f is in state STARTED 2026-04-20 00:47:49.231233 | orchestrator | 2026-04-20 00:47:49 | INFO  | Task 92921e92-e531-4698-91c2-36347756a782 is in state STARTED 2026-04-20 00:47:49.231238 | orchestrator | 2026-04-20 00:47:49 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:47:49.231243 | orchestrator | 2026-04-20 00:47:49 | INFO  | Task 1d04d84e-203a-4d0d-8276-972517871f3e is in state STARTED 2026-04-20 00:47:49.231265 | orchestrator | 2026-04-20 00:47:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:47:52.430933 | orchestrator | 2026-04-20 00:47:52 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:47:52.430989 | orchestrator | 2026-04-20 00:47:52 | INFO  | Task c9708ee8-9534-4d56-8026-19ed252e6341 is in state SUCCESS 2026-04-20 00:47:52.430994 | orchestrator | 2026-04-20 00:47:52 | INFO  | Task c05c8a7e-60b2-471d-a4a1-d7d2ed70c724 is in state STARTED 2026-04-20 00:47:52.430998 | orchestrator | 2026-04-20 00:47:52 | INFO  | Task a10ca595-c026-4255-baf9-f76e89c2d81f is in state STARTED 2026-04-20 00:47:52.431002 | orchestrator | 2026-04-20 00:47:52 | INFO  | Task 92921e92-e531-4698-91c2-36347756a782 is in state STARTED 2026-04-20 00:47:52.431005 | orchestrator | 2026-04-20 00:47:52 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:47:52.431008 | orchestrator | 2026-04-20 00:47:52 | INFO  | Task 1d04d84e-203a-4d0d-8276-972517871f3e is in state STARTED 2026-04-20 00:47:52.431011 | orchestrator | 2026-04-20 00:47:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:47:55.549341 | orchestrator | 2026-04-20 00:47:55 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:47:55.549409 | orchestrator | 2026-04-20 00:47:55 | INFO  | Task 
a10ca595-c026-4255-baf9-f76e89c2d81f is in state STARTED 2026-04-20 00:48:01.822705 | orchestrator | 2026-04-20 00:48:01 | INFO  | Task 92921e92-e531-4698-91c2-36347756a782 is in state STARTED 2026-04-20 00:48:01.824054 | orchestrator | 2026-04-20 00:48:01 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:48:01.825214 | orchestrator | 2026-04-20 00:48:01 | INFO  | Task 1d04d84e-203a-4d0d-8276-972517871f3e is in state STARTED 2026-04-20 00:48:01.825266 | orchestrator | 2026-04-20 00:48:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:48:04.879239 | orchestrator | 2026-04-20 00:48:04 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:48:04.949368 | orchestrator | 2026-04-20 00:48:04 | INFO  | Task c05c8a7e-60b2-471d-a4a1-d7d2ed70c724 is in state STARTED 2026-04-20 00:48:04.949463 | orchestrator | 2026-04-20 00:48:04 | INFO  | Task a10ca595-c026-4255-baf9-f76e89c2d81f is in state STARTED 2026-04-20 00:48:04.949473 | orchestrator | 2026-04-20 00:48:04 | INFO  | Task 92921e92-e531-4698-91c2-36347756a782 is in state SUCCESS 2026-04-20 00:48:04.949481 | orchestrator | 2026-04-20 00:48:04 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:48:04.949508 | orchestrator | 2026-04-20 00:48:04 | INFO  | Task 1d04d84e-203a-4d0d-8276-972517871f3e is in state STARTED 2026-04-20 00:48:04.949517 | orchestrator | 2026-04-20 00:48:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:48:07.941929 | orchestrator | 2026-04-20 00:48:07 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:48:07.955081 | orchestrator | 2026-04-20 00:48:07 | INFO  | Task c05c8a7e-60b2-471d-a4a1-d7d2ed70c724 is in state STARTED 2026-04-20 00:48:07.973897 | orchestrator | 2026-04-20 00:48:07 | INFO  | Task a10ca595-c026-4255-baf9-f76e89c2d81f is in state STARTED 2026-04-20 00:48:07.973971 | orchestrator | 2026-04-20 00:48:07 | INFO  | Task 
ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:48:17.095521 | orchestrator | 2026-04-20 00:48:17 | INFO  | Task c05c8a7e-60b2-471d-a4a1-d7d2ed70c724 is in state STARTED 2026-04-20 00:48:17.096227 | orchestrator | 2026-04-20 00:48:17 | INFO  | Task a10ca595-c026-4255-baf9-f76e89c2d81f is in state STARTED 2026-04-20 00:48:17.097067 | orchestrator | 2026-04-20 00:48:17 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:48:17.097886 | orchestrator | 2026-04-20 00:48:17 | INFO  | Task 1d04d84e-203a-4d0d-8276-972517871f3e is in state STARTED 2026-04-20 00:48:17.097940 | orchestrator | 2026-04-20 00:48:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:48:20.139364 | orchestrator | 2026-04-20 00:48:20 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:48:20.139659 | orchestrator | 2026-04-20 00:48:20 | INFO  | Task dc34de65-0c95-464d-8bc3-a124a6469e3e is in state STARTED 2026-04-20 00:48:20.146318 | orchestrator | 2026-04-20 00:48:20 | INFO  | Task c05c8a7e-60b2-471d-a4a1-d7d2ed70c724 is in state SUCCESS 2026-04-20 00:48:20.147349 | orchestrator | 2026-04-20 00:48:20.147390 | orchestrator | 2026-04-20 00:48:20.147477 | orchestrator | PLAY [Apply role homer] ******************************************************** 2026-04-20 00:48:20.147489 | orchestrator | 2026-04-20 00:48:20.147499 | orchestrator | TASK [osism.services.homer : Inform about new parameter homer_url_opensearch_dashboards] *** 2026-04-20 00:48:20.147510 | orchestrator | Monday 20 April 2026 00:47:08 +0000 (0:00:00.798) 0:00:00.798 ********** 2026-04-20 00:48:20.147520 | orchestrator | ok: [testbed-manager] => { 2026-04-20 00:48:20.147577 | orchestrator |  "msg": "The support for the homer_url_kibana has been removed. Please use the homer_url_opensearch_dashboards parameter." 
2026-04-20 00:48:20.147589 | orchestrator | } 2026-04-20 00:48:20.147600 | orchestrator | 2026-04-20 00:48:20.147610 | orchestrator | TASK [osism.services.homer : Create traefik external network] ****************** 2026-04-20 00:48:20.147619 | orchestrator | Monday 20 April 2026 00:47:09 +0000 (0:00:00.715) 0:00:01.513 ********** 2026-04-20 00:48:20.147629 | orchestrator | ok: [testbed-manager] 2026-04-20 00:48:20.147639 | orchestrator | 2026-04-20 00:48:20.147649 | orchestrator | TASK [osism.services.homer : Create required directories] ********************** 2026-04-20 00:48:20.147659 | orchestrator | Monday 20 April 2026 00:47:12 +0000 (0:00:03.002) 0:00:04.516 ********** 2026-04-20 00:48:20.147669 | orchestrator | changed: [testbed-manager] => (item=/opt/homer/configuration) 2026-04-20 00:48:20.147703 | orchestrator | ok: [testbed-manager] => (item=/opt/homer) 2026-04-20 00:48:20.147714 | orchestrator | 2026-04-20 00:48:20.148570 | orchestrator | TASK [osism.services.homer : Copy config.yml configuration file] *************** 2026-04-20 00:48:20.148591 | orchestrator | Monday 20 April 2026 00:47:15 +0000 (0:00:02.354) 0:00:06.870 ********** 2026-04-20 00:48:20.148602 | orchestrator | changed: [testbed-manager] 2026-04-20 00:48:20.148612 | orchestrator | 2026-04-20 00:48:20.148623 | orchestrator | TASK [osism.services.homer : Copy docker-compose.yml file] ********************* 2026-04-20 00:48:20.148633 | orchestrator | Monday 20 April 2026 00:47:17 +0000 (0:00:02.358) 0:00:09.228 ********** 2026-04-20 00:48:20.148643 | orchestrator | changed: [testbed-manager] 2026-04-20 00:48:20.148653 | orchestrator | 2026-04-20 00:48:20.148663 | orchestrator | TASK [osism.services.homer : Manage homer service] ***************************** 2026-04-20 00:48:20.148673 | orchestrator | Monday 20 April 2026 00:47:20 +0000 (0:00:02.657) 0:00:11.886 ********** 2026-04-20 00:48:20.148683 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage homer service (10 retries left). 
2026-04-20 00:48:20.148694 | orchestrator | ok: [testbed-manager] 2026-04-20 00:48:20.148704 | orchestrator | 2026-04-20 00:48:20.148714 | orchestrator | RUNNING HANDLER [osism.services.homer : Restart homer service] ***************** 2026-04-20 00:48:20.148724 | orchestrator | Monday 20 April 2026 00:47:47 +0000 (0:00:27.174) 0:00:39.061 ********** 2026-04-20 00:48:20.148735 | orchestrator | changed: [testbed-manager] 2026-04-20 00:48:20.148745 | orchestrator | 2026-04-20 00:48:20.148755 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:48:20.148765 | orchestrator | testbed-manager : ok=7  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-20 00:48:20.148777 | orchestrator | 2026-04-20 00:48:20.148787 | orchestrator | 2026-04-20 00:48:20.148797 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 00:48:20.148808 | orchestrator | Monday 20 April 2026 00:47:49 +0000 (0:00:02.581) 0:00:41.642 ********** 2026-04-20 00:48:20.148819 | orchestrator | =============================================================================== 2026-04-20 00:48:20.148829 | orchestrator | osism.services.homer : Manage homer service ---------------------------- 27.17s 2026-04-20 00:48:20.148839 | orchestrator | osism.services.homer : Create traefik external network ------------------ 3.00s 2026-04-20 00:48:20.148849 | orchestrator | osism.services.homer : Copy docker-compose.yml file --------------------- 2.66s 2026-04-20 00:48:20.148859 | orchestrator | osism.services.homer : Restart homer service ---------------------------- 2.58s 2026-04-20 00:48:20.148869 | orchestrator | osism.services.homer : Copy config.yml configuration file --------------- 2.36s 2026-04-20 00:48:20.148879 | orchestrator | osism.services.homer : Create required directories ---------------------- 2.35s 2026-04-20 00:48:20.148890 | orchestrator | osism.services.homer : Inform 
about new parameter homer_url_opensearch_dashboards --- 0.72s 2026-04-20 00:48:20.148900 | orchestrator | 2026-04-20 00:48:20.148910 | orchestrator | 2026-04-20 00:48:20.148920 | orchestrator | PLAY [Apply role openstackclient] ********************************************** 2026-04-20 00:48:20.148930 | orchestrator | 2026-04-20 00:48:20.148940 | orchestrator | TASK [osism.services.openstackclient : Include tasks] ************************** 2026-04-20 00:48:20.149043 | orchestrator | Monday 20 April 2026 00:47:10 +0000 (0:00:01.634) 0:00:01.634 ********** 2026-04-20 00:48:20.149055 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/openstackclient/tasks/container-Debian-family.yml for testbed-manager 2026-04-20 00:48:20.149144 | orchestrator | 2026-04-20 00:48:20.149156 | orchestrator | TASK [osism.services.openstackclient : Create required directories] ************ 2026-04-20 00:48:20.149166 | orchestrator | Monday 20 April 2026 00:47:10 +0000 (0:00:00.368) 0:00:02.003 ********** 2026-04-20 00:48:20.149175 | orchestrator | changed: [testbed-manager] => (item=/opt/configuration/environments/openstack) 2026-04-20 00:48:20.149185 | orchestrator | changed: [testbed-manager] => (item=/opt/openstackclient/data) 2026-04-20 00:48:20.150224 | orchestrator | ok: [testbed-manager] => (item=/opt/openstackclient) 2026-04-20 00:48:20.150271 | orchestrator | 2026-04-20 00:48:20.150278 | orchestrator | TASK [osism.services.openstackclient : Copy docker-compose.yml file] *********** 2026-04-20 00:48:20.150283 | orchestrator | Monday 20 April 2026 00:47:13 +0000 (0:00:02.578) 0:00:04.581 ********** 2026-04-20 00:48:20.150291 | orchestrator | changed: [testbed-manager] 2026-04-20 00:48:20.150299 | orchestrator | 2026-04-20 00:48:20.150306 | orchestrator | TASK [osism.services.openstackclient : Manage openstackclient service] ********* 2026-04-20 00:48:20.150314 | orchestrator | Monday 20 April 2026 00:47:15 +0000 (0:00:02.515) 
0:00:07.096 ********** 2026-04-20 00:48:20.150337 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage openstackclient service (10 retries left). 2026-04-20 00:48:20.150346 | orchestrator | ok: [testbed-manager] 2026-04-20 00:48:20.150352 | orchestrator | 2026-04-20 00:48:20.150381 | orchestrator | TASK [osism.services.openstackclient : Copy openstack wrapper script] ********** 2026-04-20 00:48:20.150396 | orchestrator | Monday 20 April 2026 00:47:50 +0000 (0:00:35.295) 0:00:42.392 ********** 2026-04-20 00:48:20.150400 | orchestrator | changed: [testbed-manager] 2026-04-20 00:48:20.150409 | orchestrator | 2026-04-20 00:48:20.150414 | orchestrator | TASK [osism.services.openstackclient : Remove ospurge wrapper script] ********** 2026-04-20 00:48:20.150418 | orchestrator | Monday 20 April 2026 00:47:53 +0000 (0:00:02.704) 0:00:45.096 ********** 2026-04-20 00:48:20.150423 | orchestrator | ok: [testbed-manager] 2026-04-20 00:48:20.150427 | orchestrator | 2026-04-20 00:48:20.150432 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Restart openstackclient service] *** 2026-04-20 00:48:20.150437 | orchestrator | Monday 20 April 2026 00:47:55 +0000 (0:00:01.396) 0:00:46.493 ********** 2026-04-20 00:48:20.150441 | orchestrator | changed: [testbed-manager] 2026-04-20 00:48:20.150445 | orchestrator | 2026-04-20 00:48:20.150450 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Ensure that all containers are up] *** 2026-04-20 00:48:20.150454 | orchestrator | Monday 20 April 2026 00:47:58 +0000 (0:00:03.636) 0:00:50.130 ********** 2026-04-20 00:48:20.150461 | orchestrator | changed: [testbed-manager] 2026-04-20 00:48:20.150466 | orchestrator | 2026-04-20 00:48:20.150470 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Wait for an healthy service] *** 2026-04-20 00:48:20.150474 | orchestrator | Monday 20 April 2026 00:48:00 +0000 (0:00:01.592) 0:00:51.723 ********** 2026-04-20 00:48:20.150478 | orchestrator | changed: 
[testbed-manager] 2026-04-20 00:48:20.150483 | orchestrator | 2026-04-20 00:48:20.150487 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Copy bash completion script] *** 2026-04-20 00:48:20.150492 | orchestrator | Monday 20 April 2026 00:48:01 +0000 (0:00:01.310) 0:00:53.033 ********** 2026-04-20 00:48:20.150496 | orchestrator | ok: [testbed-manager] 2026-04-20 00:48:20.150500 | orchestrator | 2026-04-20 00:48:20.150504 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:48:20.150509 | orchestrator | testbed-manager : ok=10  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-20 00:48:20.150514 | orchestrator | 2026-04-20 00:48:20.150518 | orchestrator | 2026-04-20 00:48:20.150523 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 00:48:20.150527 | orchestrator | Monday 20 April 2026 00:48:02 +0000 (0:00:00.932) 0:00:53.967 ********** 2026-04-20 00:48:20.150542 | orchestrator | =============================================================================== 2026-04-20 00:48:20.150549 | orchestrator | osism.services.openstackclient : Manage openstackclient service -------- 35.30s 2026-04-20 00:48:20.150556 | orchestrator | osism.services.openstackclient : Restart openstackclient service -------- 3.64s 2026-04-20 00:48:20.150564 | orchestrator | osism.services.openstackclient : Copy openstack wrapper script ---------- 2.70s 2026-04-20 00:48:20.150571 | orchestrator | osism.services.openstackclient : Create required directories ------------ 2.58s 2026-04-20 00:48:20.150577 | orchestrator | osism.services.openstackclient : Copy docker-compose.yml file ----------- 2.52s 2026-04-20 00:48:20.150586 | orchestrator | osism.services.openstackclient : Ensure that all containers are up ------ 1.59s 2026-04-20 00:48:20.150603 | orchestrator | osism.services.openstackclient : Remove ospurge wrapper script ---------- 1.40s 
2026-04-20 00:48:20.150611 | orchestrator | osism.services.openstackclient : Wait for an healthy service ------------ 1.31s 2026-04-20 00:48:20.150618 | orchestrator | osism.services.openstackclient : Copy bash completion script ------------ 0.93s 2026-04-20 00:48:20.150624 | orchestrator | osism.services.openstackclient : Include tasks -------------------------- 0.37s 2026-04-20 00:48:20.150631 | orchestrator | 2026-04-20 00:48:20.150637 | orchestrator | 2026-04-20 00:48:20.150644 | orchestrator | PLAY [Apply role common] ******************************************************* 2026-04-20 00:48:20.150651 | orchestrator | 2026-04-20 00:48:20.150658 | orchestrator | TASK [common : include_tasks] ************************************************** 2026-04-20 00:48:20.150666 | orchestrator | Monday 20 April 2026 00:47:01 +0000 (0:00:00.484) 0:00:00.484 ********** 2026-04-20 00:48:20.150673 | orchestrator | included: /ansible/roles/common/tasks/deploy.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:48:20.150681 | orchestrator | 2026-04-20 00:48:20.150686 | orchestrator | TASK [common : Ensuring config directories exist] ****************************** 2026-04-20 00:48:20.150690 | orchestrator | Monday 20 April 2026 00:47:03 +0000 (0:00:01.220) 0:00:01.705 ********** 2026-04-20 00:48:20.150694 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'cron'}, 'cron']) 2026-04-20 00:48:20.150699 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'cron'}, 'cron']) 2026-04-20 00:48:20.150704 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'cron'}, 'cron']) 2026-04-20 00:48:20.150708 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'cron'}, 'cron']) 2026-04-20 00:48:20.150712 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'cron'}, 'cron']) 2026-04-20 00:48:20.150716 | orchestrator | changed: 
[testbed-node-0] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-04-20 00:48:20.150721 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'cron'}, 'cron']) 2026-04-20 00:48:20.150725 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-04-20 00:48:20.150729 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-04-20 00:48:20.150734 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-04-20 00:48:20.150738 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'cron'}, 'cron']) 2026-04-20 00:48:20.150750 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-04-20 00:48:20.150758 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-04-20 00:48:20.150765 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-04-20 00:48:20.150772 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-04-20 00:48:20.150779 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-04-20 00:48:20.150783 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2026-04-20 00:48:20.150787 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-04-20 00:48:20.150792 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-04-20 00:48:20.150796 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-04-20 00:48:20.150804 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2026-04-20 00:48:20.150808 | orchestrator 
| 2026-04-20 00:48:20.150812 | orchestrator | TASK [common : include_tasks] ************************************************** 2026-04-20 00:48:20.150817 | orchestrator | Monday 20 April 2026 00:47:07 +0000 (0:00:04.274) 0:00:05.979 ********** 2026-04-20 00:48:20.150821 | orchestrator | included: /ansible/roles/common/tasks/copy-certs.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:48:20.150830 | orchestrator | 2026-04-20 00:48:20.150835 | orchestrator | TASK [service-cert-copy : common | Copying over extra CA certificates] ********* 2026-04-20 00:48:20.150839 | orchestrator | Monday 20 April 2026 00:47:08 +0000 (0:00:01.425) 0:00:07.405 ********** 2026-04-20 00:48:20.150845 | orchestrator | changed: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.150852 | orchestrator | changed: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.150857 | orchestrator | changed: [testbed-node-1] => (item={'key': 'fluentd', 'value': 
{'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.150862 | orchestrator | changed: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.150870 | orchestrator | changed: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.150875 | orchestrator | changed: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.150882 | orchestrator | changed: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.150890 | orchestrator | changed: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.150895 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.150900 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.150904 | orchestrator | changed: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.150913 | orchestrator | changed: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.150919 | orchestrator | changed: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.150927 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.150935 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.150940 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.150945 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.150949 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.150954 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.150997 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151003 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151011 | orchestrator |
2026-04-20 00:48:20.151015 | orchestrator | TASK [service-cert-copy : common | Copying over backend internal TLS certificate] ***
2026-04-20 00:48:20.151020 | orchestrator | Monday 20 April 2026 00:47:15 +0000 (0:00:06.146) 0:00:13.552 **********
2026-04-20 00:48:20.151035 | orchestrator | skipping: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.151041 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151045 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.151055 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.151059 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151064 | orchestrator | skipping: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151085 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151093 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:48:20.151103 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.151108 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151113 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151117 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151122 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151126 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:48:20.151131 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.151155 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.151165 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:48:20.151169 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:48:20.151174 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151181 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151186 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151190 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:48:20.151195 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151199 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:48:20.151204 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.151209 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151213 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151231 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:48:20.151236 | orchestrator |
2026-04-20 00:48:20.151241 | orchestrator | TASK [service-cert-copy : common | Copying over backend internal TLS key] ******
2026-04-20 00:48:20.151245 | orchestrator | Monday 20 April 2026 00:47:18 +0000 (0:00:03.498) 0:00:17.051 **********
2026-04-20 00:48:20.151249 | orchestrator | skipping: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.151256 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.151261 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.151265 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151270 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151274 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151295 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151301 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:48:20.151305 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.151312 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.151316 | orchestrator | skipping: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151321 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151326 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151330 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151338 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151342 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:48:20.151347 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:48:20.151351 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:48:20.151366 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.151373 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151377 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151382 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:48:20.151386 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151391 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:48:20.151395 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.151400 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151407 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151411 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:48:20.151416 | orchestrator |
2026-04-20 00:48:20.151420 | orchestrator | TASK [common : Ensure /var/log/journal exists on EL10 systems] *****************
2026-04-20 00:48:20.151425 | orchestrator | Monday 20 April 2026 00:47:24 +0000 (0:00:05.820) 0:00:22.871 **********
2026-04-20 00:48:20.151429 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:48:20.151433 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:48:20.151438 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:48:20.151442 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:48:20.151446 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:48:20.151451 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:48:20.151455 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:48:20.151459 | orchestrator |
2026-04-20 00:48:20.151473 | orchestrator | TASK [common : Copying over /run subdirectories conf] **************************
2026-04-20 00:48:20.151478 | orchestrator | Monday 20 April 2026 00:47:26 +0000 (0:00:01.949) 0:00:24.821 **********
2026-04-20 00:48:20.151483 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:48:20.151487 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:48:20.151491 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:48:20.151496 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:48:20.151500 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:48:20.151504 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:48:20.151509 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:48:20.151513 | orchestrator |
2026-04-20 00:48:20.151517 | orchestrator | TASK [common : Restart systemd-tmpfiles] ***************************************
2026-04-20 00:48:20.151521 | orchestrator | Monday 20 April 2026 00:47:27 +0000 (0:00:01.075) 0:00:25.897 **********
2026-04-20 00:48:20.151526 | orchestrator | skipping: [testbed-manager]
2026-04-20 00:48:20.151543 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:48:20.151548 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:48:20.151552 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:48:20.151557 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:48:20.151561 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:48:20.151565 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:48:20.151569 | orchestrator |
2026-04-20 00:48:20.151576 | orchestrator | TASK [common : Copying over kolla.target] **************************************
2026-04-20 00:48:20.151580 | orchestrator | Monday 20 April 2026 00:47:28 +0000 (0:00:01.102) 0:00:26.999 **********
2026-04-20 00:48:20.151585 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:48:20.151589 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:48:20.151593 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:48:20.151597 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:48:20.151602 | orchestrator | changed: [testbed-manager]
2026-04-20 00:48:20.151606 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:48:20.151610 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:48:20.151617 | orchestrator |
2026-04-20 00:48:20.151628 | orchestrator | TASK [common : Copying over config.json files for services] ********************
2026-04-20 00:48:20.151637 | orchestrator | Monday 20 April 2026 00:47:30 +0000 (0:00:02.271) 0:00:29.270 **********
2026-04-20 00:48:20.151645 | orchestrator | changed: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.151657 | orchestrator | changed: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.151664 | orchestrator | changed: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.151672 | orchestrator | changed: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2026-04-20 00:48:20.151695 | orchestrator | changed: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151703 | orchestrator | changed: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:48:20.151717 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group':
'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.151734 | orchestrator | changed: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.151743 | orchestrator | changed: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.151751 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': 
['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.151759 | orchestrator | changed: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.151767 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.151796 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.151808 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': 
{'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.151816 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.151832 | orchestrator | changed: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.151840 | orchestrator | changed: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': 
{'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.151845 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.151849 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.151856 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.151881 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.151890 | orchestrator | 2026-04-20 00:48:20.151897 | orchestrator | TASK [common : Find custom fluentd input config files] ************************* 2026-04-20 00:48:20.151905 | orchestrator | Monday 20 April 2026 00:47:36 +0000 (0:00:05.876) 0:00:35.147 ********** 2026-04-20 00:48:20.151913 | orchestrator | [WARNING]: Skipped 2026-04-20 00:48:20.151921 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/input' path due 2026-04-20 00:48:20.151934 | orchestrator | to this access issue: 2026-04-20 00:48:20.151938 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/input' is not a 2026-04-20 00:48:20.151943 | orchestrator | directory 2026-04-20 00:48:20.151947 | orchestrator | ok: [testbed-manager -> localhost] 2026-04-20 00:48:20.151952 | orchestrator | 2026-04-20 00:48:20.151956 | orchestrator | TASK [common : Find custom fluentd filter config files] ************************ 2026-04-20 00:48:20.151961 | orchestrator | Monday 20 April 2026 00:47:37 +0000 (0:00:01.071) 0:00:36.218 ********** 2026-04-20 00:48:20.151965 | orchestrator | [WARNING]: Skipped 2026-04-20 00:48:20.151969 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/filter' path due 2026-04-20 00:48:20.151974 | orchestrator | to this access issue: 2026-04-20 00:48:20.151978 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/filter' is not a 2026-04-20 00:48:20.151982 | orchestrator | directory 2026-04-20 00:48:20.151987 | orchestrator | ok: [testbed-manager -> localhost] 2026-04-20 00:48:20.151991 | orchestrator | 2026-04-20 00:48:20.151995 | orchestrator | TASK [common : Find 
custom fluentd format config files] ************************ 2026-04-20 00:48:20.152000 | orchestrator | Monday 20 April 2026 00:47:38 +0000 (0:00:01.040) 0:00:37.259 ********** 2026-04-20 00:48:20.152004 | orchestrator | [WARNING]: Skipped 2026-04-20 00:48:20.152008 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/format' path due 2026-04-20 00:48:20.152012 | orchestrator | to this access issue: 2026-04-20 00:48:20.152017 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/format' is not a 2026-04-20 00:48:20.152021 | orchestrator | directory 2026-04-20 00:48:20.152025 | orchestrator | ok: [testbed-manager -> localhost] 2026-04-20 00:48:20.152030 | orchestrator | 2026-04-20 00:48:20.152037 | orchestrator | TASK [common : Find custom fluentd output config files] ************************ 2026-04-20 00:48:20.152041 | orchestrator | Monday 20 April 2026 00:47:39 +0000 (0:00:00.891) 0:00:38.150 ********** 2026-04-20 00:48:20.152045 | orchestrator | [WARNING]: Skipped 2026-04-20 00:48:20.152050 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/output' path due 2026-04-20 00:48:20.152054 | orchestrator | to this access issue: 2026-04-20 00:48:20.152058 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/output' is not a 2026-04-20 00:48:20.152063 | orchestrator | directory 2026-04-20 00:48:20.152067 | orchestrator | ok: [testbed-manager -> localhost] 2026-04-20 00:48:20.152071 | orchestrator | 2026-04-20 00:48:20.152076 | orchestrator | TASK [common : Copying over fluentd.conf] ************************************** 2026-04-20 00:48:20.152080 | orchestrator | Monday 20 April 2026 00:47:40 +0000 (0:00:00.986) 0:00:39.137 ********** 2026-04-20 00:48:20.152084 | orchestrator | changed: [testbed-manager] 2026-04-20 00:48:20.152088 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:48:20.152093 | orchestrator | changed: [testbed-node-2] 2026-04-20 
00:48:20.152097 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:48:20.152101 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:48:20.152105 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:48:20.152110 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:48:20.152114 | orchestrator | 2026-04-20 00:48:20.152118 | orchestrator | TASK [common : Copying over cron logrotate config file] ************************ 2026-04-20 00:48:20.152123 | orchestrator | Monday 20 April 2026 00:47:46 +0000 (0:00:06.146) 0:00:45.284 ********** 2026-04-20 00:48:20.152127 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2026-04-20 00:48:20.152131 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2026-04-20 00:48:20.152136 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2026-04-20 00:48:20.152140 | orchestrator | changed: [testbed-node-3] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2026-04-20 00:48:20.152144 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2026-04-20 00:48:20.152152 | orchestrator | changed: [testbed-node-4] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2026-04-20 00:48:20.152156 | orchestrator | changed: [testbed-node-5] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2026-04-20 00:48:20.152160 | orchestrator | 2026-04-20 00:48:20.152165 | orchestrator | TASK [common : Ensure RabbitMQ Erlang cookie exists] *************************** 2026-04-20 00:48:20.152169 | orchestrator | Monday 20 April 2026 00:47:50 +0000 (0:00:03.595) 0:00:48.879 ********** 2026-04-20 00:48:20.152173 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:48:20.152177 | orchestrator | changed: [testbed-node-0] 2026-04-20 
00:48:20.152182 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:48:20.152186 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:48:20.152190 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:48:20.152194 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:48:20.152202 | orchestrator | changed: [testbed-manager] 2026-04-20 00:48:20.152206 | orchestrator | 2026-04-20 00:48:20.152211 | orchestrator | TASK [common : Ensuring config directories have correct owner and permission] *** 2026-04-20 00:48:20.152215 | orchestrator | Monday 20 April 2026 00:47:53 +0000 (0:00:03.257) 0:00:52.137 ********** 2026-04-20 00:48:20.152219 | orchestrator | ok: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.152227 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152232 | orchestrator | ok: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 
'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.152236 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152241 | orchestrator | ok: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152248 | orchestrator | ok: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152253 | orchestrator | ok: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.152262 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152268 | orchestrator | ok: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.152273 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 
'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152278 | orchestrator | ok: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.152282 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152291 | orchestrator | ok: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152296 | orchestrator | ok: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152302 | orchestrator | ok: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.152307 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 
'dimensions': {}}})  2026-04-20 00:48:20.152313 | orchestrator | ok: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.152318 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152323 | orchestrator | ok: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152327 | orchestrator | ok: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': 
{'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152335 | orchestrator | ok: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152339 | orchestrator | 2026-04-20 00:48:20.152344 | orchestrator | TASK [common : Copy rabbitmq-env.conf to kolla toolbox] ************************ 2026-04-20 00:48:20.152348 | orchestrator | Monday 20 April 2026 00:47:56 +0000 (0:00:02.912) 0:00:55.049 ********** 2026-04-20 00:48:20.152352 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-04-20 00:48:20.152357 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-04-20 00:48:20.152361 | orchestrator | changed: [testbed-node-3] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-04-20 00:48:20.152365 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-04-20 00:48:20.152369 | orchestrator | changed: [testbed-node-4] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-04-20 00:48:20.152374 | orchestrator | changed: [testbed-node-5] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-04-20 00:48:20.152381 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2026-04-20 00:48:20.152385 | orchestrator | 2026-04-20 00:48:20.152389 | orchestrator 
| TASK [common : Copy rabbitmq erl_inetrc to kolla toolbox] ********************** 2026-04-20 00:48:20.152394 | orchestrator | Monday 20 April 2026 00:47:58 +0000 (0:00:02.449) 0:00:57.499 ********** 2026-04-20 00:48:20.152398 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-04-20 00:48:20.152402 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-04-20 00:48:20.152407 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-04-20 00:48:20.152411 | orchestrator | changed: [testbed-node-3] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-04-20 00:48:20.152415 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-04-20 00:48:20.152419 | orchestrator | changed: [testbed-node-4] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-04-20 00:48:20.152424 | orchestrator | changed: [testbed-node-5] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2026-04-20 00:48:20.152428 | orchestrator | 2026-04-20 00:48:20.152434 | orchestrator | TASK [service-check-containers : common | Check containers] ******************** 2026-04-20 00:48:20.152439 | orchestrator | Monday 20 April 2026 00:48:03 +0000 (0:00:04.109) 0:01:01.608 ********** 2026-04-20 00:48:20.152443 | orchestrator | changed: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.152448 | orchestrator | changed: 
[testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.152455 | orchestrator | changed: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.152460 | orchestrator | changed: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152464 | orchestrator | changed: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152471 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152478 | orchestrator | changed: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.152483 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 
'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152487 | orchestrator | changed: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.152495 | orchestrator | changed: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.152499 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152505 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 
'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152513 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152617 | orchestrator | changed: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2026-04-20 00:48:20.152641 | orchestrator | changed: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': 
['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152646 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152658 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152663 | orchestrator | changed: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152667 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152672 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152682 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:48:20.152687 | orchestrator | 2026-04-20 00:48:20.152692 | orchestrator | TASK [service-check-containers : common | Notify handlers to restart containers] *** 2026-04-20 00:48:20.152696 | orchestrator | Monday 20 April 2026 00:48:08 +0000 (0:00:05.119) 0:01:06.728 ********** 2026-04-20 00:48:20.152701 | orchestrator | changed: [testbed-manager] => { 2026-04-20 00:48:20.152705 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:48:20.152710 | orchestrator | } 2026-04-20 00:48:20.152714 | 
orchestrator | changed: [testbed-node-0] => { 2026-04-20 00:48:20.152718 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:48:20.152723 | orchestrator | } 2026-04-20 00:48:20.152727 | orchestrator | changed: [testbed-node-1] => { 2026-04-20 00:48:20.152732 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:48:20.152736 | orchestrator | } 2026-04-20 00:48:20.152740 | orchestrator | changed: [testbed-node-2] => { 2026-04-20 00:48:20.152745 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:48:20.152749 | orchestrator | } 2026-04-20 00:48:20.152753 | orchestrator | changed: [testbed-node-3] => { 2026-04-20 00:48:20.152760 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:48:20.152765 | orchestrator | } 2026-04-20 00:48:20.152769 | orchestrator | changed: [testbed-node-4] => { 2026-04-20 00:48:20.152773 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:48:20.152777 | orchestrator | } 2026-04-20 00:48:20.152782 | orchestrator | changed: [testbed-node-5] => { 2026-04-20 00:48:20.152788 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:48:20.152793 | orchestrator | } 2026-04-20 00:48:20.152797 | orchestrator | 2026-04-20 00:48:20.152801 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-20 00:48:20.152806 | orchestrator | Monday 20 April 2026 00:48:09 +0000 (0:00:00.868) 0:01:07.597 ********** 2026-04-20 00:48:20.152810 | orchestrator | skipping: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  
2026-04-20 00:48:20.152815 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152819 | orchestrator | skipping: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152824 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-04-20 00:48:20.152829 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152837 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152844 | orchestrator | skipping: [testbed-manager] 2026-04-20 00:48:20.152849 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-04-20 00:48:20.152854 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': 
'/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152858 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152866 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-04-20 00:48:20.152871 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 
'dimensions': {}}})  2026-04-20 00:48:20.152876 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152880 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:48:20.152885 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:48:20.152892 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-04-20 00:48:20.152900 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:48:20.152904 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152911 | orchestrator 
| skipping: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152915 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:48:20.152920 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-04-20 00:48:20.152924 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152929 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 
'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152933 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:48:20.152938 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/fluentd:5.0.9.20260328', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2026-04-20 00:48:20.152943 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/kolla-toolbox:20.3.1.20260328', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152953 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cron:3.0.20260328', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:48:20.152957 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:48:20.152962 | orchestrator | 2026-04-20 00:48:20.152966 | orchestrator | TASK [common : Creating log volume] ******************************************** 2026-04-20 00:48:20.153022 | orchestrator | Monday 20 April 2026 00:48:10 +0000 (0:00:01.796) 0:01:09.393 ********** 2026-04-20 00:48:20.153027 | orchestrator | changed: [testbed-manager] 2026-04-20 00:48:20.153031 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:48:20.153035 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:48:20.153039 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:48:20.153043 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:48:20.153047 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:48:20.153051 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:48:20.153055 | orchestrator | 2026-04-20 00:48:20.153062 | orchestrator | TASK [common : Link kolla_logs volume to /var/log/kolla] *********************** 2026-04-20 00:48:20.153066 | orchestrator | Monday 20 April 2026 00:48:12 +0000 (0:00:01.758) 0:01:11.152 ********** 2026-04-20 00:48:20.153070 | orchestrator | changed: [testbed-manager] 2026-04-20 00:48:20.153074 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:48:20.153078 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:48:20.153082 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:48:20.153086 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:48:20.153090 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:48:20.153095 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:48:20.153099 | orchestrator | 2026-04-20 00:48:20.153103 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-04-20 00:48:20.153107 | orchestrator | Monday 20 April 2026 00:48:13 +0000 (0:00:01.343) 0:01:12.495 ********** 2026-04-20 00:48:20.153111 | 
orchestrator | 2026-04-20 00:48:20.153115 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-04-20 00:48:20.153119 | orchestrator | Monday 20 April 2026 00:48:14 +0000 (0:00:00.068) 0:01:12.564 ********** 2026-04-20 00:48:20.153123 | orchestrator | 2026-04-20 00:48:20.153127 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-04-20 00:48:20.153131 | orchestrator | Monday 20 April 2026 00:48:14 +0000 (0:00:00.058) 0:01:12.623 ********** 2026-04-20 00:48:20.153135 | orchestrator | 2026-04-20 00:48:20.153139 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-04-20 00:48:20.153144 | orchestrator | Monday 20 April 2026 00:48:14 +0000 (0:00:00.058) 0:01:12.681 ********** 2026-04-20 00:48:20.153148 | orchestrator | 2026-04-20 00:48:20.153152 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-04-20 00:48:20.153156 | orchestrator | Monday 20 April 2026 00:48:14 +0000 (0:00:00.078) 0:01:12.759 ********** 2026-04-20 00:48:20.153160 | orchestrator | 2026-04-20 00:48:20.153164 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-04-20 00:48:20.153168 | orchestrator | Monday 20 April 2026 00:48:14 +0000 (0:00:00.072) 0:01:12.831 ********** 2026-04-20 00:48:20.153172 | orchestrator | 2026-04-20 00:48:20.153176 | orchestrator | TASK [common : Flush handlers] ************************************************* 2026-04-20 00:48:20.153180 | orchestrator | Monday 20 April 2026 00:48:14 +0000 (0:00:00.061) 0:01:12.893 ********** 2026-04-20 00:48:20.153187 | orchestrator | 2026-04-20 00:48:20.153192 | orchestrator | RUNNING HANDLER [common : Restart fluentd container] *************************** 2026-04-20 00:48:20.153196 | orchestrator | Monday 20 April 2026 00:48:14 +0000 (0:00:00.086) 0:01:12.979 ********** 2026-04-20 00:48:20.153208 
| orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": true, "msg":
| orchestrator |   Traceback (most recent call last):
| orchestrator |     File "/usr/lib/python3/dist-packages/docker/api/client.py", line 275, in _raise_for_status
| orchestrator |       response.raise_for_status()
| orchestrator |     File "/usr/lib/python3/dist-packages/requests/models.py", line 1021, in raise_for_status
| orchestrator |       raise HTTPError(http_error_msg, response=self)
| orchestrator |   requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Ffluentd
| orchestrator |
| orchestrator |   The above exception was the direct cause of the following exception:
| orchestrator |
| orchestrator |   Traceback (most recent call last):
| orchestrator |     File "/tmp/ansible_kolla_container_payload_bcoi0zcc/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py", line 421, in main
| orchestrator |       result = bool(getattr(cw, module.params.get('action'))())
| orchestrator |     File "/tmp/ansible_kolla_container_payload_bcoi0zcc/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py", line 352, in recreate_or_restart_container
| orchestrator |       self.start_container()
| orchestrator |     File "/tmp/ansible_kolla_container_payload_bcoi0zcc/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py", line 370, in start_container
| orchestrator |       self.pull_image()
| orchestrator |     File "/tmp/ansible_kolla_container_payload_bcoi0zcc/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py", line 202, in pull_image
| orchestrator |       json.loads(line.strip().decode('utf-8')) for line in self.dc.pull(
| orchestrator |     File "/usr/lib/python3/dist-packages/docker/api/image.py", line 429, in pull
| orchestrator |       self._raise_for_status(response)
| orchestrator |     File "/usr/lib/python3/dist-packages/docker/api/client.py", line 277, in _raise_for_status
| orchestrator |       raise create_api_error_from_http_exception(e) from e
| orchestrator |     File "/usr/lib/python3/dist-packages/docker/errors.py", line 39, in create_api_error_from_http_exception
| orchestrator |       raise cls(e, response=response, explanation=explanation) from e
| orchestrator |   docker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Ffluentd: Internal Server Error ("unknown: repository kolla/release/2024.2/fluentd not found")
| orchestrator | }
2026-04-20 00:48:20.153214 | orchestrator | fatal: [testbed-manager]: FAILED! => {"changed": true, "msg":
| orchestrator |   Traceback (most recent call last):
| orchestrator |     File "/usr/lib/python3/dist-packages/docker/api/client.py", line 275, in _raise_for_status
| orchestrator |       response.raise_for_status()
| orchestrator |     File "/usr/lib/python3/dist-packages/requests/models.py", line 1021, in raise_for_status
| orchestrator |       raise HTTPError(http_error_msg, response=self)
| orchestrator |   requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Ffluentd
| orchestrator |
| orchestrator |   The above exception was the direct cause of the following exception:
| orchestrator |
| orchestrator |   Traceback (most recent call last):
| orchestrator |     File "/tmp/ansible_kolla_container_payload_wg86pkw3/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py", line 421, in main
| orchestrator |       result = bool(getattr(cw, module.params.get('action'))())
| orchestrator |     File "/tmp/ansible_kolla_container_payload_wg86pkw3/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py", line 352, in recreate_or_restart_container
| orchestrator |       self.start_container()
| orchestrator |     File "/tmp/ansible_kolla_container_payload_wg86pkw3/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py", line 370, in start_container
| orchestrator |       self.pull_image()
| orchestrator |     File "/tmp/ansible_kolla_container_payload_wg86pkw3/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py", line 202, in pull_image
| orchestrator |       json.loads(line.strip().decode('utf-8')) for line in self.dc.pull(
| orchestrator |     File "/usr/lib/python3/dist-packages/docker/api/image.py", line 429, in pull
| orchestrator |       self._raise_for_status(response)
| orchestrator |     File "/usr/lib/python3/dist-packages/docker/api/client.py", line 277, in _raise_for_status
| orchestrator |       raise create_api_error_from_http_exception(e) from e
| orchestrator |     File "/usr/lib/python3/dist-packages/docker/errors.py", line 39, in create_api_error_from_http_exception
| orchestrator |       raise cls(e, response=response, explanation=explanation) from e
| orchestrator |   docker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Ffluentd: Internal Server Error ("unknown: repository kolla/release/2024.2/fluentd not found")
| orchestrator | }
2026-04-20 00:48:20.153233 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": true, "msg":
| orchestrator |   Traceback (most recent call last):
| orchestrator |     File "/usr/lib/python3/dist-packages/docker/api/client.py", line 275, in _raise_for_status
| orchestrator |       response.raise_for_status()
| orchestrator |     File "/usr/lib/python3/dist-packages/requests/models.py", line 1021, in raise_for_status
| orchestrator |       raise HTTPError(http_error_msg, response=self)
| orchestrator |   requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Ffluentd
| orchestrator |
| orchestrator |   The above exception was the direct cause of the following exception:
| orchestrator |
| orchestrator |   Traceback (most recent call last):
| orchestrator |     File "/tmp/ansible_kolla_container_payload_5krdo9kh/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py", line 421, in main
| orchestrator |       result = bool(getattr(cw, module.params.get('action'))())
| orchestrator |     File "/tmp/ansible_kolla_container_payload_5krdo9kh/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py", line 352, in recreate_or_restart_container
| orchestrator |       self.start_container()
| orchestrator |     File "/tmp/ansible_kolla_container_payload_5krdo9kh/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py", line 370, in start_container
| orchestrator |       self.pull_image()
| orchestrator |     File "/tmp/ansible_kolla_container_payload_5krdo9kh/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py", line 202, in pull_image
| orchestrator |       json.loads(line.strip().decode('utf-8')) for line in self.dc.pull(
| orchestrator |     File "/usr/lib/python3/dist-packages/docker/api/image.py", line 429, in pull
| orchestrator |       self._raise_for_status(response)
| orchestrator |     File "/usr/lib/python3/dist-packages/docker/api/client.py", line 277, in _raise_for_status
| orchestrator |       raise create_api_error_from_http_exception(e) from e
| orchestrator |     File "/usr/lib/python3/dist-packages/docker/errors.py", line 39, in create_api_error_from_http_exception
| orchestrator |       raise cls(e, response=response, explanation=explanation) from e
| orchestrator |   docker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Ffluentd: Internal Server Error ("unknown: repository kolla/release/2024.2/fluentd not found")
| orchestrator | }
2026-04-20 00:48:20.153246 | orchestrator | fatal: [testbed-node-2]: FAILED!
=> {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Ffluentd\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_ln7kdwjg/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_ln7kdwjg/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_ln7kdwjg/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_ln7kdwjg/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n 
raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Ffluentd: Internal Server Error (\"unknown: repository kolla/release/2024.2/fluentd not found\")\\n'"} 2026-04-20 00:48:20.153256 | orchestrator | fatal: [testbed-node-3]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Ffluentd\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_fud0aq71/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_fud0aq71/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_fud0aq71/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_fud0aq71/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File 
\"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Ffluentd: Internal Server Error (\"unknown: repository kolla/release/2024.2/fluentd not found\")\\n'"} 2026-04-20 00:48:20.153270 | orchestrator | fatal: [testbed-node-4]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Ffluentd\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_eq2mkk46/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_eq2mkk46/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File 
\"/tmp/ansible_kolla_container_payload_eq2mkk46/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_eq2mkk46/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Ffluentd: Internal Server Error (\"unknown: repository kolla/release/2024.2/fluentd not found\")\\n'"} 2026-04-20 00:48:20.153275 | orchestrator | fatal: [testbed-node-5]: FAILED! 
=> {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Ffluentd\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_rpxaj4rm/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_rpxaj4rm/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_rpxaj4rm/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_rpxaj4rm/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n 
raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=5.0.9.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Ffluentd: Internal Server Error (\"unknown: repository kolla/release/2024.2/fluentd not found\")\\n'"} 2026-04-20 00:48:20.153282 | orchestrator | 2026-04-20 00:48:20.153287 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:48:20.153291 | orchestrator | testbed-manager : ok=20  changed=13  unreachable=0 failed=1  skipped=6  rescued=0 ignored=0 2026-04-20 00:48:20.153296 | orchestrator | testbed-node-0 : ok=16  changed=13  unreachable=0 failed=1  skipped=6  rescued=0 ignored=0 2026-04-20 00:48:20.153303 | orchestrator | testbed-node-1 : ok=16  changed=13  unreachable=0 failed=1  skipped=6  rescued=0 ignored=0 2026-04-20 00:48:20.153307 | orchestrator | testbed-node-2 : ok=16  changed=13  unreachable=0 failed=1  skipped=6  rescued=0 ignored=0 2026-04-20 00:48:20.153312 | orchestrator | testbed-node-3 : ok=16  changed=13  unreachable=0 failed=1  skipped=6  rescued=0 ignored=0 2026-04-20 00:48:20.153316 | orchestrator | testbed-node-4 : ok=16  changed=13  unreachable=0 failed=1  skipped=6  rescued=0 ignored=0 2026-04-20 00:48:20.153320 | orchestrator | testbed-node-5 : ok=16  changed=13  unreachable=0 failed=1  skipped=6  rescued=0 ignored=0 2026-04-20 00:48:20.153325 | orchestrator | 2026-04-20 00:48:20.153329 | orchestrator | 2026-04-20 00:48:20.153335 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 00:48:20.153340 | orchestrator | Monday 20 April 2026 00:48:17 +0000 (0:00:03.442) 0:01:16.422 ********** 2026-04-20 00:48:20.153344 | orchestrator | =============================================================================== 2026-04-20 00:48:20.153348 | orchestrator | common : Copying over fluentd.conf 
-------------------------------------- 6.15s 2026-04-20 00:48:20.153352 | orchestrator | service-cert-copy : common | Copying over extra CA certificates --------- 6.15s 2026-04-20 00:48:20.153356 | orchestrator | common : Copying over config.json files for services -------------------- 5.88s 2026-04-20 00:48:20.153361 | orchestrator | service-cert-copy : common | Copying over backend internal TLS key ------ 5.82s 2026-04-20 00:48:20.153365 | orchestrator | service-check-containers : common | Check containers -------------------- 5.12s 2026-04-20 00:48:20.153369 | orchestrator | common : Ensuring config directories exist ------------------------------ 4.27s 2026-04-20 00:48:20.153376 | orchestrator | common : Copy rabbitmq erl_inetrc to kolla toolbox ---------------------- 4.11s 2026-04-20 00:48:20.153380 | orchestrator | common : Copying over cron logrotate config file ------------------------ 3.60s 2026-04-20 00:48:20.153384 | orchestrator | service-cert-copy : common | Copying over backend internal TLS certificate --- 3.50s 2026-04-20 00:48:20.153388 | orchestrator | common : Restart fluentd container -------------------------------------- 3.44s 2026-04-20 00:48:20.153392 | orchestrator | common : Ensure RabbitMQ Erlang cookie exists --------------------------- 3.26s 2026-04-20 00:48:20.153396 | orchestrator | common : Ensuring config directories have correct owner and permission --- 2.91s 2026-04-20 00:48:20.153400 | orchestrator | common : Copy rabbitmq-env.conf to kolla toolbox ------------------------ 2.45s 2026-04-20 00:48:20.153405 | orchestrator | common : Copying over kolla.target -------------------------------------- 2.27s 2026-04-20 00:48:20.153409 | orchestrator | common : Ensure /var/log/journal exists on EL10 systems ----------------- 1.95s 2026-04-20 00:48:20.153413 | orchestrator | service-check-containers : Include tasks -------------------------------- 1.80s 2026-04-20 00:48:20.153417 | orchestrator | common : Creating log volume 
-------------------------------------------- 1.76s 2026-04-20 00:48:20.153421 | orchestrator | common : include_tasks -------------------------------------------------- 1.43s 2026-04-20 00:48:20.153425 | orchestrator | common : Link kolla_logs volume to /var/log/kolla ----------------------- 1.34s 2026-04-20 00:48:20.153429 | orchestrator | common : include_tasks -------------------------------------------------- 1.22s 2026-04-20 00:48:20.153433 | orchestrator | 2026-04-20 00:48:20 | INFO  | Task ab15b63e-0890-48c3-8f96-2e38a1574dc6 is in state STARTED 2026-04-20 00:48:20.153841 | orchestrator | 2026-04-20 00:48:20 | INFO  | Task a10ca595-c026-4255-baf9-f76e89c2d81f is in state STARTED 2026-04-20 00:48:20.154455 | orchestrator | 2026-04-20 00:48:20 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:48:20.157501 | orchestrator | 2026-04-20 00:48:20 | INFO  | Task 4e5ce30a-0f4d-4e63-b13c-b4134be4ff6c is in state STARTED 2026-04-20 00:48:20.158155 | orchestrator | 2026-04-20 00:48:20 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:48:20.158830 | orchestrator | 2026-04-20 00:48:20 | INFO  | Task 1d04d84e-203a-4d0d-8276-972517871f3e is in state STARTED 2026-04-20 00:48:20.158891 | orchestrator | 2026-04-20 00:48:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:48:23.191733 | orchestrator | 2026-04-20 00:48:23 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:48:23.191866 | orchestrator | 2026-04-20 00:48:23 | INFO  | Task dc34de65-0c95-464d-8bc3-a124a6469e3e is in state STARTED 2026-04-20 00:48:23.192453 | orchestrator | 2026-04-20 00:48:23 | INFO  | Task ab15b63e-0890-48c3-8f96-2e38a1574dc6 is in state STARTED 2026-04-20 00:48:23.193151 | orchestrator | 2026-04-20 00:48:23 | INFO  | Task a10ca595-c026-4255-baf9-f76e89c2d81f is in state STARTED 2026-04-20 00:48:23.193641 | orchestrator | 2026-04-20 00:48:23 | INFO  | Task 
64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:48:23.197391 | orchestrator | 2026-04-20 00:48:23 | INFO  | Task 4e5ce30a-0f4d-4e63-b13c-b4134be4ff6c is in state STARTED 2026-04-20 00:48:23.197847 | orchestrator | 2026-04-20 00:48:23 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:48:23.198507 | orchestrator | 2026-04-20 00:48:23 | INFO  | Task 1d04d84e-203a-4d0d-8276-972517871f3e is in state STARTED 2026-04-20 00:48:23.198633 | orchestrator | 2026-04-20 00:48:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:48:26.228918 | orchestrator | 2026-04-20 00:48:26 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:48:26.229360 | orchestrator | 2026-04-20 00:48:26 | INFO  | Task dc34de65-0c95-464d-8bc3-a124a6469e3e is in state STARTED 2026-04-20 00:48:26.230247 | orchestrator | 2026-04-20 00:48:26 | INFO  | Task ab15b63e-0890-48c3-8f96-2e38a1574dc6 is in state STARTED 2026-04-20 00:48:26.230839 | orchestrator | 2026-04-20 00:48:26 | INFO  | Task a10ca595-c026-4255-baf9-f76e89c2d81f is in state STARTED 2026-04-20 00:48:26.233289 | orchestrator | 2026-04-20 00:48:26 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:48:26.233690 | orchestrator | 2026-04-20 00:48:26 | INFO  | Task 4e5ce30a-0f4d-4e63-b13c-b4134be4ff6c is in state STARTED 2026-04-20 00:48:26.235251 | orchestrator | 2026-04-20 00:48:26 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:48:26.238732 | orchestrator | 2026-04-20 00:48:26 | INFO  | Task 1d04d84e-203a-4d0d-8276-972517871f3e is in state STARTED 2026-04-20 00:48:26.238803 | orchestrator | 2026-04-20 00:48:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:48:29.407735 | orchestrator | 2026-04-20 00:48:29 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:48:29.407864 | orchestrator | 2026-04-20 00:48:29 | INFO  | Task 
dc34de65-0c95-464d-8bc3-a124a6469e3e is in state STARTED 2026-04-20 00:48:29.407876 | orchestrator | 2026-04-20 00:48:29 | INFO  | Task ab15b63e-0890-48c3-8f96-2e38a1574dc6 is in state STARTED 2026-04-20 00:48:29.407884 | orchestrator | 2026-04-20 00:48:29 | INFO  | Task a10ca595-c026-4255-baf9-f76e89c2d81f is in state STARTED 2026-04-20 00:48:29.407890 | orchestrator | 2026-04-20 00:48:29 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:48:29.407897 | orchestrator | 2026-04-20 00:48:29 | INFO  | Task 4e5ce30a-0f4d-4e63-b13c-b4134be4ff6c is in state STARTED 2026-04-20 00:48:29.408337 | orchestrator | 2026-04-20 00:48:29 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:48:29.409719 | orchestrator | 2026-04-20 00:48:29.409788 | orchestrator | 2026-04-20 00:48:29.409798 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-04-20 00:48:29.409806 | orchestrator | 2026-04-20 00:48:29.409813 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-04-20 00:48:29.409819 | orchestrator | Monday 20 April 2026 00:47:09 +0000 (0:00:00.979) 0:00:00.979 ********** 2026-04-20 00:48:29.409827 | orchestrator | changed: [testbed-manager] => (item=enable_netdata_True) 2026-04-20 00:48:29.409834 | orchestrator | changed: [testbed-node-0] => (item=enable_netdata_True) 2026-04-20 00:48:29.409840 | orchestrator | changed: [testbed-node-1] => (item=enable_netdata_True) 2026-04-20 00:48:29.409846 | orchestrator | changed: [testbed-node-2] => (item=enable_netdata_True) 2026-04-20 00:48:29.409856 | orchestrator | changed: [testbed-node-3] => (item=enable_netdata_True) 2026-04-20 00:48:29.409869 | orchestrator | changed: [testbed-node-4] => (item=enable_netdata_True) 2026-04-20 00:48:29.409883 | orchestrator | changed: [testbed-node-5] => (item=enable_netdata_True) 2026-04-20 00:48:29.409894 | orchestrator | 2026-04-20 
00:48:29.409905 | orchestrator | PLAY [Apply role netdata] ****************************************************** 2026-04-20 00:48:29.409915 | orchestrator | 2026-04-20 00:48:29.409925 | orchestrator | TASK [osism.services.netdata : Include distribution specific install tasks] **** 2026-04-20 00:48:29.409935 | orchestrator | Monday 20 April 2026 00:47:11 +0000 (0:00:01.988) 0:00:02.968 ********** 2026-04-20 00:48:29.409949 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:48:29.409962 | orchestrator | 2026-04-20 00:48:29.409974 | orchestrator | TASK [osism.services.netdata : Remove old architecture-dependent repository] *** 2026-04-20 00:48:29.409986 | orchestrator | Monday 20 April 2026 00:47:12 +0000 (0:00:01.126) 0:00:04.095 ********** 2026-04-20 00:48:29.410084 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:48:29.410097 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:48:29.410104 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:48:29.410111 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:48:29.410117 | orchestrator | ok: [testbed-manager] 2026-04-20 00:48:29.410123 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:48:29.410129 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:48:29.410135 | orchestrator | 2026-04-20 00:48:29.410142 | orchestrator | TASK [osism.services.netdata : Install apt-transport-https package] ************ 2026-04-20 00:48:29.410149 | orchestrator | Monday 20 April 2026 00:47:15 +0000 (0:00:03.256) 0:00:07.351 ********** 2026-04-20 00:48:29.410155 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:48:29.410161 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:48:29.410167 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:48:29.410173 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:48:29.410179 | 
orchestrator | ok: [testbed-manager] 2026-04-20 00:48:29.410186 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:48:29.410192 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:48:29.410198 | orchestrator | 2026-04-20 00:48:29.410204 | orchestrator | TASK [osism.services.netdata : Add repository gpg key] ************************* 2026-04-20 00:48:29.410210 | orchestrator | Monday 20 April 2026 00:47:19 +0000 (0:00:04.143) 0:00:11.495 ********** 2026-04-20 00:48:29.410217 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:48:29.410223 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:48:29.410229 | orchestrator | changed: [testbed-manager] 2026-04-20 00:48:29.410235 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:48:29.410241 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:48:29.410247 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:48:29.410253 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:48:29.410259 | orchestrator | 2026-04-20 00:48:29.410266 | orchestrator | TASK [osism.services.netdata : Add repository] ********************************* 2026-04-20 00:48:29.410273 | orchestrator | Monday 20 April 2026 00:47:21 +0000 (0:00:02.043) 0:00:13.539 ********** 2026-04-20 00:48:29.410280 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:48:29.410287 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:48:29.410295 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:48:29.410302 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:48:29.410309 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:48:29.410315 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:48:29.410322 | orchestrator | changed: [testbed-manager] 2026-04-20 00:48:29.410329 | orchestrator | 2026-04-20 00:48:29.410336 | orchestrator | TASK [osism.services.netdata : Install package netdata] ************************ 2026-04-20 00:48:29.410343 | orchestrator | Monday 20 April 2026 00:47:31 +0000 (0:00:09.934) 0:00:23.473 
********** 2026-04-20 00:48:29.410351 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:48:29.410358 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:48:29.410364 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:48:29.410372 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:48:29.410379 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:48:29.410386 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:48:29.410393 | orchestrator | changed: [testbed-manager] 2026-04-20 00:48:29.410400 | orchestrator | 2026-04-20 00:48:29.410407 | orchestrator | TASK [osism.services.netdata : Include config tasks] *************************** 2026-04-20 00:48:29.410414 | orchestrator | Monday 20 April 2026 00:47:59 +0000 (0:00:27.452) 0:00:50.926 ********** 2026-04-20 00:48:29.410423 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/config.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:48:29.410433 | orchestrator | 2026-04-20 00:48:29.410440 | orchestrator | TASK [osism.services.netdata : Copy configuration files] *********************** 2026-04-20 00:48:29.410447 | orchestrator | Monday 20 April 2026 00:48:01 +0000 (0:00:01.787) 0:00:52.713 ********** 2026-04-20 00:48:29.410459 | orchestrator | changed: [testbed-node-0] => (item=netdata.conf) 2026-04-20 00:48:29.410467 | orchestrator | changed: [testbed-node-4] => (item=netdata.conf) 2026-04-20 00:48:29.410473 | orchestrator | changed: [testbed-node-2] => (item=netdata.conf) 2026-04-20 00:48:29.410479 | orchestrator | changed: [testbed-node-1] => (item=netdata.conf) 2026-04-20 00:48:29.410505 | orchestrator | changed: [testbed-node-3] => (item=netdata.conf) 2026-04-20 00:48:29.410512 | orchestrator | changed: [testbed-node-5] => (item=netdata.conf) 2026-04-20 00:48:29.410558 | orchestrator | changed: [testbed-manager] => (item=netdata.conf) 
2026-04-20 00:48:29.410566 | orchestrator | changed: [testbed-node-4] => (item=stream.conf) 2026-04-20 00:48:29.410572 | orchestrator | changed: [testbed-node-3] => (item=stream.conf) 2026-04-20 00:48:29.410579 | orchestrator | changed: [testbed-node-0] => (item=stream.conf) 2026-04-20 00:48:29.410588 | orchestrator | changed: [testbed-node-1] => (item=stream.conf) 2026-04-20 00:48:29.410598 | orchestrator | changed: [testbed-node-2] => (item=stream.conf) 2026-04-20 00:48:29.410609 | orchestrator | changed: [testbed-node-5] => (item=stream.conf) 2026-04-20 00:48:29.410618 | orchestrator | changed: [testbed-manager] => (item=stream.conf) 2026-04-20 00:48:29.410629 | orchestrator | 2026-04-20 00:48:29.410638 | orchestrator | TASK [osism.services.netdata : Retrieve /etc/netdata/.opt-out-from-anonymous-statistics status] *** 2026-04-20 00:48:29.410650 | orchestrator | Monday 20 April 2026 00:48:06 +0000 (0:00:05.585) 0:00:58.299 ********** 2026-04-20 00:48:29.410660 | orchestrator | ok: [testbed-manager] 2026-04-20 00:48:29.410669 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:48:29.410679 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:48:29.410689 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:48:29.410698 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:48:29.410707 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:48:29.410717 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:48:29.410726 | orchestrator | 2026-04-20 00:48:29.410737 | orchestrator | TASK [osism.services.netdata : Opt out from anonymous statistics] ************** 2026-04-20 00:48:29.410747 | orchestrator | Monday 20 April 2026 00:48:08 +0000 (0:00:01.449) 0:00:59.748 ********** 2026-04-20 00:48:29.410757 | orchestrator | changed: [testbed-manager] 2026-04-20 00:48:29.410767 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:48:29.410777 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:48:29.410787 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:48:29.410798 
| orchestrator | changed: [testbed-node-3] 2026-04-20 00:48:29.410809 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:48:29.410819 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:48:29.410830 | orchestrator | 2026-04-20 00:48:29.410840 | orchestrator | TASK [osism.services.netdata : Add netdata user to docker group] *************** 2026-04-20 00:48:29.410850 | orchestrator | Monday 20 April 2026 00:48:09 +0000 (0:00:01.382) 0:01:01.131 ********** 2026-04-20 00:48:29.410860 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:48:29.410866 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:48:29.410872 | orchestrator | ok: [testbed-manager] 2026-04-20 00:48:29.410879 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:48:29.410885 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:48:29.410891 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:48:29.410897 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:48:29.410903 | orchestrator | 2026-04-20 00:48:29.410910 | orchestrator | TASK [osism.services.netdata : Manage service netdata] ************************* 2026-04-20 00:48:29.410916 | orchestrator | Monday 20 April 2026 00:48:11 +0000 (0:00:01.684) 0:01:02.815 ********** 2026-04-20 00:48:29.410922 | orchestrator | ok: [testbed-manager] 2026-04-20 00:48:29.410928 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:48:29.410934 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:48:29.410940 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:48:29.410946 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:48:29.410952 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:48:29.410958 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:48:29.410964 | orchestrator | 2026-04-20 00:48:29.410970 | orchestrator | TASK [osism.services.netdata : Include host type specific tasks] *************** 2026-04-20 00:48:29.410983 | orchestrator | Monday 20 April 2026 00:48:12 +0000 (0:00:01.452) 0:01:04.267 ********** 2026-04-20 00:48:29.410990 | orchestrator | 
included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/server.yml for testbed-manager
2026-04-20 00:48:29.411007 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/client.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:48:29.411021 | orchestrator |
2026-04-20 00:48:29.411035 | orchestrator | TASK [osism.services.netdata : Set sysctl vm.max_map_count parameter] **********
2026-04-20 00:48:29.411045 | orchestrator | Monday 20 April 2026 00:48:13 +0000 (0:00:01.136) 0:01:05.404 **********
2026-04-20 00:48:29.411055 | orchestrator | changed: [testbed-manager]
2026-04-20 00:48:29.411065 | orchestrator |
2026-04-20 00:48:29.411074 | orchestrator | RUNNING HANDLER [osism.services.netdata : Restart service netdata] *************
2026-04-20 00:48:29.411085 | orchestrator | Monday 20 April 2026 00:48:15 +0000 (0:00:01.512) 0:01:06.917 **********
2026-04-20 00:48:29.411093 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:48:29.411102 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:48:29.411111 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:48:29.411120 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:48:29.411130 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:48:29.411139 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:48:29.411149 | orchestrator | changed: [testbed-manager]
2026-04-20 00:48:29.411159 | orchestrator |
2026-04-20 00:48:29.411181 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:48:29.411192 | orchestrator | testbed-manager : ok=16  changed=8  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:48:29.411205 | orchestrator | testbed-node-0 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:48:29.411214 | orchestrator | testbed-node-1 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:48:29.411220 | orchestrator | testbed-node-2 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:48:29.411237 | orchestrator | testbed-node-3 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:48:29.411244 | orchestrator | testbed-node-4 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:48:29.411251 | orchestrator | testbed-node-5 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:48:29.411257 | orchestrator |
2026-04-20 00:48:29.411263 | orchestrator |
2026-04-20 00:48:29.411269 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:48:29.411276 | orchestrator | Monday 20 April 2026 00:48:26 +0000 (0:00:11.502) 0:01:18.419 **********
2026-04-20 00:48:29.411282 | orchestrator | ===============================================================================
2026-04-20 00:48:29.411288 | orchestrator | osism.services.netdata : Install package netdata ----------------------- 27.45s
2026-04-20 00:48:29.411294 | orchestrator | osism.services.netdata : Restart service netdata ----------------------- 11.50s
2026-04-20 00:48:29.411301 | orchestrator | osism.services.netdata : Add repository --------------------------------- 9.93s
2026-04-20 00:48:29.411307 | orchestrator | osism.services.netdata : Copy configuration files ----------------------- 5.59s
2026-04-20 00:48:29.411313 | orchestrator | osism.services.netdata : Install apt-transport-https package ------------ 4.14s
2026-04-20 00:48:29.411319 | orchestrator | osism.services.netdata : Remove old architecture-dependent repository --- 3.26s
2026-04-20 00:48:29.411326 | orchestrator | osism.services.netdata : Add repository gpg key ------------------------- 2.04s
2026-04-20 00:48:29.411341 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.99s
2026-04-20 00:48:29.411347 | orchestrator | osism.services.netdata : Include config tasks --------------------------- 1.79s
2026-04-20 00:48:29.411353 | orchestrator | osism.services.netdata : Add netdata user to docker group --------------- 1.68s
2026-04-20 00:48:29.411360 | orchestrator | osism.services.netdata : Set sysctl vm.max_map_count parameter ---------- 1.51s
2026-04-20 00:48:29.411366 | orchestrator | osism.services.netdata : Manage service netdata ------------------------- 1.45s
2026-04-20 00:48:29.411372 | orchestrator | osism.services.netdata : Retrieve /etc/netdata/.opt-out-from-anonymous-statistics status --- 1.45s
2026-04-20 00:48:29.411378 | orchestrator | osism.services.netdata : Opt out from anonymous statistics -------------- 1.38s
2026-04-20 00:48:29.411385 | orchestrator | osism.services.netdata : Include host type specific tasks --------------- 1.14s
2026-04-20 00:48:29.411391 | orchestrator | osism.services.netdata : Include distribution specific install tasks ---- 1.13s
2026-04-20 00:48:29.411398 | orchestrator | 2026-04-20 00:48:29 | INFO  | Task 1d04d84e-203a-4d0d-8276-972517871f3e is in state SUCCESS
2026-04-20 00:48:29.411405 | orchestrator | 2026-04-20 00:48:29 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:48:32.454383 | orchestrator | 2026-04-20 00:48:32 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED
2026-04-20 00:48:32.454658 | orchestrator | 2026-04-20 00:48:32 | INFO  | Task dc34de65-0c95-464d-8bc3-a124a6469e3e is in state STARTED
2026-04-20 00:48:32.455752 | orchestrator | 2026-04-20 00:48:32 | INFO  | Task ab15b63e-0890-48c3-8f96-2e38a1574dc6 is in state STARTED
2026-04-20 00:48:32.455908 | orchestrator | 2026-04-20 00:48:32 | INFO  | Task a10ca595-c026-4255-baf9-f76e89c2d81f is in state SUCCESS
2026-04-20 00:48:32.458094 | orchestrator | 2026-04-20 00:48:32 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state
STARTED
2026-04-20 00:48:32.458928 | orchestrator | 2026-04-20 00:48:32 | INFO  | Task 4e5ce30a-0f4d-4e63-b13c-b4134be4ff6c is in state STARTED
2026-04-20 00:48:32.462410 | orchestrator | 2026-04-20 00:48:32 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:48:32.462473 | orchestrator | 2026-04-20 00:48:32 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:48:35.501297 | orchestrator | 2026-04-20 00:48:35 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED
2026-04-20 00:48:35.501430 | orchestrator | 2026-04-20 00:48:35 | INFO  | Task dc34de65-0c95-464d-8bc3-a124a6469e3e is in state SUCCESS
2026-04-20 00:48:35.502508 | orchestrator |
2026-04-20 00:48:35.502601 | orchestrator |
2026-04-20 00:48:35.502612 | orchestrator | PLAY [Apply role phpmyadmin] ***************************************************
2026-04-20 00:48:35.502621 | orchestrator |
2026-04-20 00:48:35.502628 | orchestrator | TASK [osism.services.phpmyadmin : Create traefik external network] *************
2026-04-20 00:48:35.502634 | orchestrator | Monday 20 April 2026 00:47:26 +0000 (0:00:00.308) 0:00:00.308 **********
2026-04-20 00:48:35.502638 | orchestrator | ok: [testbed-manager]
2026-04-20 00:48:35.502650 | orchestrator |
2026-04-20 00:48:35.502655 | orchestrator | TASK [osism.services.phpmyadmin : Create required directories] *****************
2026-04-20 00:48:35.502659 | orchestrator | Monday 20 April 2026 00:47:28 +0000 (0:00:01.875) 0:00:02.184 **********
2026-04-20 00:48:35.502664 | orchestrator | changed: [testbed-manager] => (item=/opt/phpmyadmin)
2026-04-20 00:48:35.502668 | orchestrator |
2026-04-20 00:48:35.502672 | orchestrator | TASK [osism.services.phpmyadmin : Copy docker-compose.yml file] ****************
2026-04-20 00:48:35.502677 | orchestrator | Monday 20 April 2026 00:47:29 +0000 (0:00:00.936) 0:00:03.120 **********
2026-04-20 00:48:35.502681 | orchestrator | changed: [testbed-manager]
2026-04-20 00:48:35.502685 | orchestrator |
2026-04-20 00:48:35.502689 | orchestrator | TASK [osism.services.phpmyadmin : Manage phpmyadmin service] *******************
2026-04-20 00:48:35.502714 | orchestrator | Monday 20 April 2026 00:47:31 +0000 (0:00:01.754) 0:00:04.875 **********
2026-04-20 00:48:35.502719 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage phpmyadmin service (10 retries left).
2026-04-20 00:48:35.502723 | orchestrator | ok: [testbed-manager]
2026-04-20 00:48:35.502727 | orchestrator |
2026-04-20 00:48:35.502731 | orchestrator | RUNNING HANDLER [osism.services.phpmyadmin : Restart phpmyadmin service] *******
2026-04-20 00:48:35.502735 | orchestrator | Monday 20 April 2026 00:48:27 +0000 (0:00:56.734) 0:01:01.609 **********
2026-04-20 00:48:35.502739 | orchestrator | changed: [testbed-manager]
2026-04-20 00:48:35.502743 | orchestrator |
2026-04-20 00:48:35.502747 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:48:35.502752 | orchestrator | testbed-manager : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:48:35.502758 | orchestrator |
2026-04-20 00:48:35.502762 | orchestrator |
2026-04-20 00:48:35.502765 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:48:35.502769 | orchestrator | Monday 20 April 2026 00:48:30 +0000 (0:00:03.159) 0:01:04.768 **********
2026-04-20 00:48:35.502773 | orchestrator | ===============================================================================
2026-04-20 00:48:35.502777 | orchestrator | osism.services.phpmyadmin : Manage phpmyadmin service ------------------ 56.73s
2026-04-20 00:48:35.502781 | orchestrator | osism.services.phpmyadmin : Restart phpmyadmin service ------------------ 3.16s
2026-04-20 00:48:35.502785 | orchestrator | osism.services.phpmyadmin : Create traefik external network ------------- 1.88s
2026-04-20 00:48:35.502789 | orchestrator | osism.services.phpmyadmin : Copy docker-compose.yml file ---------------- 1.75s
2026-04-20 00:48:35.502793 | orchestrator | osism.services.phpmyadmin : Create required directories ----------------- 0.94s
2026-04-20 00:48:35.502797 | orchestrator |
2026-04-20 00:48:35.502801 | orchestrator |
2026-04-20 00:48:35.502805 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-20 00:48:35.502809 | orchestrator |
2026-04-20 00:48:35.502812 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-20 00:48:35.502816 | orchestrator | Monday 20 April 2026 00:48:22 +0000 (0:00:00.266) 0:00:00.266 **********
2026-04-20 00:48:35.502820 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:48:35.502824 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:48:35.502828 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:48:35.502832 | orchestrator |
2026-04-20 00:48:35.502836 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-20 00:48:35.502840 | orchestrator | Monday 20 April 2026 00:48:23 +0000 (0:00:00.385) 0:00:00.651 **********
2026-04-20 00:48:35.502844 | orchestrator | ok: [testbed-node-0] => (item=enable_memcached_True)
2026-04-20 00:48:35.502849 | orchestrator | ok: [testbed-node-1] => (item=enable_memcached_True)
2026-04-20 00:48:35.502853 | orchestrator | ok: [testbed-node-2] => (item=enable_memcached_True)
2026-04-20 00:48:35.502857 | orchestrator |
2026-04-20 00:48:35.502861 | orchestrator | PLAY [Apply role memcached] ****************************************************
2026-04-20 00:48:35.502865 | orchestrator |
2026-04-20 00:48:35.502869 | orchestrator | TASK [memcached : include_tasks] ***********************************************
2026-04-20 00:48:35.502873 | orchestrator | Monday 20 April 2026 00:48:23 +0000 (0:00:00.430) 0:00:01.081 **********
2026-04-20 00:48:35.502877 | orchestrator | included:
/ansible/roles/memcached/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:48:35.502882 | orchestrator |
2026-04-20 00:48:35.502886 | orchestrator | TASK [memcached : Ensuring config directories exist] ***************************
2026-04-20 00:48:35.502890 | orchestrator | Monday 20 April 2026 00:48:23 +0000 (0:00:00.406) 0:00:01.488 **********
2026-04-20 00:48:35.502894 | orchestrator | changed: [testbed-node-1] => (item=memcached)
2026-04-20 00:48:35.502910 | orchestrator | changed: [testbed-node-0] => (item=memcached)
2026-04-20 00:48:35.502975 | orchestrator | changed: [testbed-node-2] => (item=memcached)
2026-04-20 00:48:35.502982 | orchestrator |
2026-04-20 00:48:35.502986 | orchestrator | TASK [memcached : Copying over config.json files for services] *****************
2026-04-20 00:48:35.502995 | orchestrator | Monday 20 April 2026 00:48:25 +0000 (0:00:01.434) 0:00:02.923 **********
2026-04-20 00:48:35.502999 | orchestrator | changed: [testbed-node-2] => (item=memcached)
2026-04-20 00:48:35.503003 | orchestrator | changed: [testbed-node-0] => (item=memcached)
2026-04-20 00:48:35.503007 | orchestrator | changed: [testbed-node-1] => (item=memcached)
2026-04-20 00:48:35.503011 | orchestrator |
2026-04-20 00:48:35.503015 | orchestrator | TASK [service-check-containers : memcached | Check containers] *****************
2026-04-20 00:48:35.503019 | orchestrator | Monday 20 April 2026 00:48:27 +0000 (0:00:01.937) 0:00:04.860 **********
2026-04-20 00:48:35.503038 | orchestrator | changed: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/2024.2/memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})
2026-04-20 00:48:35.503045 | orchestrator | changed: [testbed-node-1] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/2024.2/memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})
2026-04-20 00:48:35.503050 | orchestrator | changed: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/2024.2/memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})
2026-04-20 00:48:35.503054 | orchestrator |
2026-04-20 00:48:35.503058 | orchestrator | TASK [service-check-containers : memcached | Notify handlers to restart containers] ***
2026-04-20 00:48:35.503062 | orchestrator | Monday 20 April 2026 00:48:28 +0000 (0:00:01.296) 0:00:06.157 **********
2026-04-20 00:48:35.503066 | orchestrator | changed: [testbed-node-0] => {
2026-04-20 00:48:35.503070 | orchestrator |  "msg": "Notifying handlers"
2026-04-20 00:48:35.503074 | orchestrator | }
2026-04-20 00:48:35.503078 | orchestrator | changed: [testbed-node-2] => {
2026-04-20 00:48:35.503082 | orchestrator |  "msg": "Notifying handlers"
2026-04-20 00:48:35.503086 | orchestrator | }
2026-04-20 00:48:35.503090 | orchestrator | changed: [testbed-node-1] => {
2026-04-20 00:48:35.503094 | orchestrator |  "msg": "Notifying handlers"
2026-04-20 00:48:35.503098 | orchestrator | }
2026-04-20 00:48:35.503102 | orchestrator |
2026-04-20 00:48:35.503106 | orchestrator | TASK [service-check-containers : Include tasks] ********************************
2026-04-20 00:48:35.503110 | orchestrator | Monday 20 April 2026 00:48:29 +0000 (0:00:00.676) 0:00:06.833 **********
2026-04-20 00:48:35.503124 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/2024.2/memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})
2026-04-20 00:48:35.503128 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:48:35.503142 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/2024.2/memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})
2026-04-20 00:48:35.503147 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:48:35.503151 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/2024.2/memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})
2026-04-20 00:48:35.503155 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:48:35.503159 | orchestrator |
2026-04-20 00:48:35.503163 | orchestrator | RUNNING HANDLER [memcached : Restart memcached container] **********************
2026-04-20 00:48:35.503167 | orchestrator | Monday 20 April 2026 00:48:30 +0000 (0:00:01.442) 0:00:08.276
**********
2026-04-20 00:48:35.503175 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=1.6.24.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fmemcached\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_rk89csft/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_rk89csft/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_rk89csft/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_rk89csft/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=1.6.24.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fmemcached: Internal Server Error (\"unknown: repository kolla/release/2024.2/memcached not found\")\\n'"}
2026-04-20 00:48:35.503191 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=1.6.24.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fmemcached\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_56y_b5hr/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_56y_b5hr/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_56y_b5hr/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_56y_b5hr/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=1.6.24.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fmemcached: Internal Server Error (\"unknown: repository kolla/release/2024.2/memcached not found\")\\n'"}
2026-04-20 00:48:35.503204 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=1.6.24.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fmemcached\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_dytc5pdy/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_dytc5pdy/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_dytc5pdy/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_dytc5pdy/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=1.6.24.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fmemcached: Internal Server Error (\"unknown: repository kolla/release/2024.2/memcached not found\")\\n'"}
2026-04-20 00:48:35.503212 | orchestrator |
2026-04-20 00:48:35.503216 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:48:35.503222 | orchestrator | testbed-node-0 : ok=7  changed=4  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0
2026-04-20 00:48:35.503227 | orchestrator | testbed-node-1 : ok=7  changed=4  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0
2026-04-20 00:48:35.503234 | orchestrator | testbed-node-2 : ok=7  changed=4  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0
2026-04-20
00:48:35.503240 | orchestrator |
2026-04-20 00:48:35.503246 | orchestrator |
2026-04-20 00:48:35.503250 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:48:35.503254 | orchestrator | Monday 20 April 2026 00:48:32 +0000 (0:00:01.905) 0:00:10.181 **********
2026-04-20 00:48:35.503257 | orchestrator | ===============================================================================
2026-04-20 00:48:35.503261 | orchestrator | memcached : Copying over config.json files for services ----------------- 1.94s
2026-04-20 00:48:35.503265 | orchestrator | memcached : Restart memcached container --------------------------------- 1.90s
2026-04-20 00:48:35.503269 | orchestrator | service-check-containers : Include tasks -------------------------------- 1.44s
2026-04-20 00:48:35.503273 | orchestrator | memcached : Ensuring config directories exist --------------------------- 1.43s
2026-04-20 00:48:35.503277 | orchestrator | service-check-containers : memcached | Check containers ----------------- 1.30s
2026-04-20 00:48:35.503281 | orchestrator | service-check-containers : memcached | Notify handlers to restart containers --- 0.68s
2026-04-20 00:48:35.503292 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.43s
2026-04-20 00:48:35.503296 | orchestrator | memcached : include_tasks ----------------------------------------------- 0.41s
2026-04-20 00:48:35.503300 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.39s
2026-04-20 00:48:35.503304 | orchestrator | 2026-04-20 00:48:35 | INFO  | Task ab15b63e-0890-48c3-8f96-2e38a1574dc6 is in state STARTED
2026-04-20 00:48:35.503308 | orchestrator | 2026-04-20 00:48:35 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:48:35.503757 | orchestrator | 2026-04-20 00:48:35 | INFO  | Task 4e5ce30a-0f4d-4e63-b13c-b4134be4ff6c is in state STARTED
2026-04-20 00:48:35.504226 | orchestrator | 2026-04-20 00:48:35 | INFO  | Task 3ee1df91-9689-476b-b528-532e6eba695e is in state STARTED
2026-04-20 00:48:35.505700 | orchestrator | 2026-04-20 00:48:35 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:48:35.505749 | orchestrator | 2026-04-20 00:48:35 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:48:38.534536 | orchestrator | 2026-04-20 00:48:38 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED
2026-04-20 00:48:38.540809 | orchestrator | 2026-04-20 00:48:38 | INFO  | Task ab15b63e-0890-48c3-8f96-2e38a1574dc6 is in state STARTED
2026-04-20 00:48:38.547178 | orchestrator | 2026-04-20 00:48:38 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:48:38.551040 | orchestrator | 2026-04-20 00:48:38 | INFO  | Task 4e5ce30a-0f4d-4e63-b13c-b4134be4ff6c is in state SUCCESS
2026-04-20 00:48:38.551985 | orchestrator |
2026-04-20 00:48:38.552034 | orchestrator |
2026-04-20 00:48:38.552059 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-20 00:48:38.552069 | orchestrator |
2026-04-20 00:48:38.552076 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-20 00:48:38.552084 | orchestrator | Monday 20 April 2026 00:48:22 +0000 (0:00:00.348) 0:00:00.348 **********
2026-04-20 00:48:38.552091 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:48:38.552099 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:48:38.552106 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:48:38.552113 | orchestrator |
2026-04-20 00:48:38.552119 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-20 00:48:38.552126 | orchestrator | Monday 20 April 2026 00:48:22 +0000 (0:00:00.323) 0:00:00.672 **********
2026-04-20 00:48:38.552133 | orchestrator | ok: [testbed-node-0] => (item=enable_redis_True)
2026-04-20 00:48:38.552141 | orchestrator | ok: [testbed-node-1] => (item=enable_redis_True)
2026-04-20 00:48:38.552147 | orchestrator | ok: [testbed-node-2] => (item=enable_redis_True)
2026-04-20 00:48:38.552154 | orchestrator |
2026-04-20 00:48:38.552160 | orchestrator | PLAY [Apply role redis] ********************************************************
2026-04-20 00:48:38.552167 | orchestrator |
2026-04-20 00:48:38.552173 | orchestrator | TASK [redis : include_tasks] ***************************************************
2026-04-20 00:48:38.552180 | orchestrator | Monday 20 April 2026 00:48:22 +0000 (0:00:00.267) 0:00:00.940 **********
2026-04-20 00:48:38.552187 | orchestrator | included: /ansible/roles/redis/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:48:38.552193 | orchestrator |
2026-04-20 00:48:38.552200 | orchestrator | TASK [redis : Ensuring config directories exist] *******************************
2026-04-20 00:48:38.552206 | orchestrator | Monday 20 April 2026 00:48:23 +0000 (0:00:01.024) 0:00:01.964 **********
2026-04-20 00:48:38.552215 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})
2026-04-20 00:48:38.552251 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})
2026-04-20 00:48:38.552259 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})
2026-04-20 00:48:38.552267 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})
2026-04-20 00:48:38.552293 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})
2026-04-20 00:48:38.552301 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})
2026-04-20 00:48:38.552308 | orchestrator |
2026-04-20 00:48:38.552322 | orchestrator | TASK [redis : Copying over default config.json files] **************************
2026-04-20 00:48:38.552328 | orchestrator | Monday 20 April 2026 00:48:26 +0000 (0:00:02.224) 0:00:04.189 **********
2026-04-20 00:48:38.552335 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test':
['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-20 00:48:38.552347 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-20 00:48:38.552354 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-20 00:48:38.552360 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen 
redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-20 00:48:38.552377 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-20 00:48:38.552385 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-20 00:48:38.552391 | orchestrator | 2026-04-20 00:48:38.552397 | orchestrator | TASK [redis : Copying over redis config files] ********************************* 2026-04-20 00:48:38.552404 | orchestrator | Monday 20 April 2026 00:48:29 +0000 (0:00:03.087) 0:00:07.277 ********** 2026-04-20 00:48:38.552416 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 
'image': 'registry.osism.tech/kolla/release/2024.2/redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-20 00:48:38.552423 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-20 00:48:38.552430 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-20 00:48:38.552438 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-20 00:48:38.552448 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-20 00:48:38.552456 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-20 00:48:38.552463 | orchestrator | 2026-04-20 00:48:38.552470 | 
orchestrator | TASK [service-check-containers : redis | Check containers] ********************* 2026-04-20 00:48:38.552480 | orchestrator | Monday 20 April 2026 00:48:32 +0000 (0:00:02.956) 0:00:10.234 ********** 2026-04-20 00:48:38.552486 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-20 00:48:38.552493 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2026-04-20 00:48:38.552505 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen 
redis-server 6379'], 'timeout': '30'}}}) 2026-04-20 00:48:38.552512 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-20 00:48:38.552525 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-20 00:48:38.552533 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis-sentinel:7.0.15.20260328', 'volumes': 
['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2026-04-20 00:48:38.552571 | orchestrator | 2026-04-20 00:48:38.552578 | orchestrator | TASK [service-check-containers : redis | Notify handlers to restart containers] *** 2026-04-20 00:48:38.552585 | orchestrator | Monday 20 April 2026 00:48:34 +0000 (0:00:01.984) 0:00:12.219 ********** 2026-04-20 00:48:38.552591 | orchestrator | changed: [testbed-node-0] => { 2026-04-20 00:48:38.552599 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:48:38.552605 | orchestrator | } 2026-04-20 00:48:38.552612 | orchestrator | changed: [testbed-node-1] => { 2026-04-20 00:48:38.552618 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:48:38.552625 | orchestrator | } 2026-04-20 00:48:38.552631 | orchestrator | changed: [testbed-node-2] => { 2026-04-20 00:48:38.552637 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:48:38.552644 | orchestrator | } 2026-04-20 00:48:38.552650 | orchestrator | 2026-04-20 00:48:38.552656 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-20 00:48:38.552662 | orchestrator | Monday 20 April 2026 00:48:34 +0000 (0:00:00.609) 0:00:12.828 ********** 2026-04-20 00:48:38.552669 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})  2026-04-20 00:48:38.552677 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})  2026-04-20 00:48:38.552685 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:48:38.552692 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})  2026-04-20 00:48:38.552699 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis-sentinel:7.0.15.20260328', 'volumes': 
['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})  2026-04-20 00:48:38.552707 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:48:38.552728 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis:7.0.15.20260328', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})  2026-04-20 00:48:38.552738 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/redis-sentinel:7.0.15.20260328', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})  2026-04-20 00:48:38.552743 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:48:38.552748 | orchestrator | 2026-04-20 00:48:38.552752 | orchestrator | TASK [redis : Flush handlers] 
************************************************** 2026-04-20 00:48:38.552756 | orchestrator | Monday 20 April 2026 00:48:35 +0000 (0:00:00.919) 0:00:13.748 ********** 2026-04-20 00:48:38.552761 | orchestrator | 2026-04-20 00:48:38.552765 | orchestrator | TASK [redis : Flush handlers] ************************************************** 2026-04-20 00:48:38.552769 | orchestrator | Monday 20 April 2026 00:48:35 +0000 (0:00:00.064) 0:00:13.813 ********** 2026-04-20 00:48:38.552774 | orchestrator | 2026-04-20 00:48:38.552778 | orchestrator | TASK [redis : Flush handlers] ************************************************** 2026-04-20 00:48:38.552782 | orchestrator | Monday 20 April 2026 00:48:35 +0000 (0:00:00.065) 0:00:13.879 ********** 2026-04-20 00:48:38.552787 | orchestrator | 2026-04-20 00:48:38.552791 | orchestrator | RUNNING HANDLER [redis : Restart redis container] ****************************** 2026-04-20 00:48:38.552795 | orchestrator | Monday 20 April 2026 00:48:35 +0000 (0:00:00.066) 0:00:13.945 ********** 2026-04-20 00:48:38.552814 | orchestrator | fatal: [testbed-node-0]: FAILED! 
=> {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=7.0.15.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fredis\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_hwdgk6ym/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_hwdgk6ym/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_hwdgk6ym/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_hwdgk6ym/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n 
raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=7.0.15.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fredis: Internal Server Error (\"unknown: repository kolla/release/2024.2/redis not found\")\\n'"} 2026-04-20 00:48:38.552826 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=7.0.15.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fredis\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_ipdyfoc_/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_ipdyfoc_/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_ipdyfoc_/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_ipdyfoc_/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File 
\"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=7.0.15.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fredis: Internal Server Error (\"unknown: repository kolla/release/2024.2/redis not found\")\\n'"} 2026-04-20 00:48:38.552839 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=7.0.15.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fredis\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_cvmrajr8/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_cvmrajr8/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File 
\"/tmp/ansible_kolla_container_payload_cvmrajr8/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_cvmrajr8/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=7.0.15.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fredis: Internal Server Error (\"unknown: repository kolla/release/2024.2/redis not found\")\\n'"} 2026-04-20 00:48:38.552848 | orchestrator | 2026-04-20 00:48:38.552853 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:48:38.552859 | orchestrator | testbed-node-0 : ok=8  changed=5  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 2026-04-20 00:48:38.552864 | orchestrator | testbed-node-1 : ok=8  changed=5  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 2026-04-20 00:48:38.552868 | orchestrator | testbed-node-2 : ok=8  changed=5  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 2026-04-20 00:48:38.552873 | orchestrator | 2026-04-20 00:48:38.552877 | orchestrator | 2026-04-20 00:48:38.552881 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 
00:48:38.552886 | orchestrator | Monday 20 April 2026 00:48:37 +0000 (0:00:02.124) 0:00:16.069 ********** 2026-04-20 00:48:38.552890 | orchestrator | =============================================================================== 2026-04-20 00:48:38.552895 | orchestrator | redis : Copying over default config.json files -------------------------- 3.09s 2026-04-20 00:48:38.552899 | orchestrator | redis : Copying over redis config files --------------------------------- 2.96s 2026-04-20 00:48:38.552904 | orchestrator | redis : Ensuring config directories exist ------------------------------- 2.22s 2026-04-20 00:48:38.552908 | orchestrator | redis : Restart redis container ----------------------------------------- 2.12s 2026-04-20 00:48:38.552912 | orchestrator | service-check-containers : redis | Check containers --------------------- 1.98s 2026-04-20 00:48:38.552916 | orchestrator | redis : include_tasks --------------------------------------------------- 1.02s 2026-04-20 00:48:38.552919 | orchestrator | service-check-containers : Include tasks -------------------------------- 0.92s 2026-04-20 00:48:38.552923 | orchestrator | service-check-containers : redis | Notify handlers to restart containers --- 0.61s 2026-04-20 00:48:38.552927 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.32s 2026-04-20 00:48:38.552931 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.27s 2026-04-20 00:48:38.552935 | orchestrator | redis : Flush handlers -------------------------------------------------- 0.20s 2026-04-20 00:48:38.553164 | orchestrator | 2026-04-20 00:48:38 | INFO  | Task 3ee1df91-9689-476b-b528-532e6eba695e is in state STARTED 2026-04-20 00:48:38.556169 | orchestrator | 2026-04-20 00:48:38 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:48:38.556235 | orchestrator | 2026-04-20 00:48:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 
00:48:41.649713 | orchestrator | 2026-04-20 00:48:41 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED
2026-04-20 00:48:41.652847 | orchestrator | 2026-04-20 00:48:41 | INFO  | Task ab15b63e-0890-48c3-8f96-2e38a1574dc6 is in state STARTED
2026-04-20 00:48:41.655152 | orchestrator | 2026-04-20 00:48:41 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:48:41.658860 | orchestrator | 2026-04-20 00:48:41 | INFO  | Task 3ee1df91-9689-476b-b528-532e6eba695e is in state STARTED
2026-04-20 00:48:41.659425 | orchestrator | 2026-04-20 00:48:41 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:48:41.659723 | orchestrator | 2026-04-20 00:48:41 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:48:44.780612 | orchestrator | 2026-04-20 00:48:44 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED
2026-04-20 00:48:44.780733 | orchestrator | 2026-04-20 00:48:44 | INFO  | Task ab15b63e-0890-48c3-8f96-2e38a1574dc6 is in state STARTED
2026-04-20 00:48:44.782392 | orchestrator | 2026-04-20 00:48:44 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:48:44.785141 | orchestrator | 2026-04-20 00:48:44 | INFO  | Task 3ee1df91-9689-476b-b528-532e6eba695e is in state STARTED
2026-04-20 00:48:44.787089 | orchestrator | 2026-04-20 00:48:44 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:48:44.787135 | orchestrator | 2026-04-20 00:48:44 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:48:47.820775 | orchestrator | 2026-04-20 00:48:47 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED
2026-04-20 00:48:47.821628 | orchestrator | 2026-04-20 00:48:47 | INFO  | Task ab15b63e-0890-48c3-8f96-2e38a1574dc6 is in state STARTED
2026-04-20 00:48:47.822722 | orchestrator | 2026-04-20 00:48:47 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:48:47.823749 | orchestrator | 2026-04-20 00:48:47 | INFO  | Task 3ee1df91-9689-476b-b528-532e6eba695e is in state STARTED
2026-04-20 00:48:47.825100 | orchestrator | 2026-04-20 00:48:47 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:48:47.825669 | orchestrator | 2026-04-20 00:48:47 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:48:50.870260 | orchestrator | 2026-04-20 00:48:50 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED
2026-04-20 00:48:50.871708 | orchestrator | 2026-04-20 00:48:50 | INFO  | Task ab15b63e-0890-48c3-8f96-2e38a1574dc6 is in state STARTED
2026-04-20 00:48:50.871747 | orchestrator | 2026-04-20 00:48:50 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:48:50.873047 | orchestrator | 2026-04-20 00:48:50 | INFO  | Task 3ee1df91-9689-476b-b528-532e6eba695e is in state STARTED
2026-04-20 00:48:50.874604 | orchestrator | 2026-04-20 00:48:50 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:48:50.874846 | orchestrator | 2026-04-20 00:48:50 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:48:53.908721 | orchestrator | 2026-04-20 00:48:53 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED
2026-04-20 00:48:53.909301 | orchestrator | 2026-04-20 00:48:53 | INFO  | Task ab15b63e-0890-48c3-8f96-2e38a1574dc6 is in state SUCCESS
2026-04-20 00:48:53.913618 | orchestrator |
2026-04-20 00:48:53.913686 | orchestrator |
2026-04-20 00:48:53.913693 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-20 00:48:53.913698 | orchestrator |
2026-04-20 00:48:53.913702 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-20 00:48:53.913707 | orchestrator | Monday 20 April 2026 00:48:22 +0000 (0:00:00.328) 0:00:00.328 **********
2026-04-20 00:48:53.913711 | orchestrator | ok:
[testbed-node-0]
2026-04-20 00:48:53.913716 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:48:53.913720 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:48:53.913724 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:48:53.913728 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:48:53.913732 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:48:53.913736 | orchestrator |
2026-04-20 00:48:53.913739 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-20 00:48:53.913743 | orchestrator | Monday 20 April 2026 00:48:23 +0000 (0:00:00.616) 0:00:00.944 **********
2026-04-20 00:48:53.913747 | orchestrator | ok: [testbed-node-0] => (item=enable_openvswitch_True_enable_ovs_dpdk_False)
2026-04-20 00:48:53.913752 | orchestrator | ok: [testbed-node-1] => (item=enable_openvswitch_True_enable_ovs_dpdk_False)
2026-04-20 00:48:53.913756 | orchestrator | ok: [testbed-node-2] => (item=enable_openvswitch_True_enable_ovs_dpdk_False)
2026-04-20 00:48:53.913759 | orchestrator | ok: [testbed-node-3] => (item=enable_openvswitch_True_enable_ovs_dpdk_False)
2026-04-20 00:48:53.913763 | orchestrator | ok: [testbed-node-4] => (item=enable_openvswitch_True_enable_ovs_dpdk_False)
2026-04-20 00:48:53.913767 | orchestrator | ok: [testbed-node-5] => (item=enable_openvswitch_True_enable_ovs_dpdk_False)
2026-04-20 00:48:53.913770 | orchestrator |
2026-04-20 00:48:53.913774 | orchestrator | PLAY [Apply role openvswitch] **************************************************
2026-04-20 00:48:53.913778 | orchestrator |
2026-04-20 00:48:53.913782 | orchestrator | TASK [openvswitch : include_tasks] *********************************************
2026-04-20 00:48:53.913785 | orchestrator | Monday 20 April 2026 00:48:24 +0000 (0:00:01.090) 0:00:02.035 **********
2026-04-20 00:48:53.913794 | orchestrator | included: /ansible/roles/openvswitch/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:48:53.913800 | orchestrator |
2026-04-20 00:48:53.913804 | orchestrator | TASK [module-load : Load modules] **********************************************
2026-04-20 00:48:53.913808 | orchestrator | Monday 20 April 2026 00:48:25 +0000 (0:00:01.119) 0:00:03.154 **********
2026-04-20 00:48:53.913811 | orchestrator | changed: [testbed-node-1] => (item=openvswitch)
2026-04-20 00:48:53.913815 | orchestrator | changed: [testbed-node-0] => (item=openvswitch)
2026-04-20 00:48:53.913819 | orchestrator | changed: [testbed-node-2] => (item=openvswitch)
2026-04-20 00:48:53.913836 | orchestrator | changed: [testbed-node-4] => (item=openvswitch)
2026-04-20 00:48:53.913840 | orchestrator | changed: [testbed-node-3] => (item=openvswitch)
2026-04-20 00:48:53.913844 | orchestrator | changed: [testbed-node-5] => (item=openvswitch)
2026-04-20 00:48:53.913847 | orchestrator |
2026-04-20 00:48:53.913851 | orchestrator | TASK [module-load : Persist modules via modules-load.d] ************************
2026-04-20 00:48:53.913855 | orchestrator | Monday 20 April 2026 00:48:27 +0000 (0:00:02.206) 0:00:05.360 **********
2026-04-20 00:48:53.913859 | orchestrator | changed: [testbed-node-0] => (item=openvswitch)
2026-04-20 00:48:53.913863 | orchestrator | changed: [testbed-node-1] => (item=openvswitch)
2026-04-20 00:48:53.913866 | orchestrator | changed: [testbed-node-2] => (item=openvswitch)
2026-04-20 00:48:53.913870 | orchestrator | changed: [testbed-node-3] => (item=openvswitch)
2026-04-20 00:48:53.913874 | orchestrator | changed: [testbed-node-4] => (item=openvswitch)
2026-04-20 00:48:53.913877 | orchestrator | changed: [testbed-node-5] => (item=openvswitch)
2026-04-20 00:48:53.913881 | orchestrator |
2026-04-20 00:48:53.913885 | orchestrator | TASK [module-load : Drop module persistence] ***********************************
2026-04-20 00:48:53.913902 | orchestrator | Monday 20 April 2026 00:48:30 +0000 (0:00:02.542) 0:00:07.902 **********
2026-04-20 00:48:53.913906 | orchestrator | skipping: [testbed-node-0] => (item=openvswitch)
2026-04-20 00:48:53.913910 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:48:53.913914 | orchestrator | skipping: [testbed-node-1] => (item=openvswitch)
2026-04-20 00:48:53.913918 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:48:53.913922 | orchestrator | skipping: [testbed-node-2] => (item=openvswitch)
2026-04-20 00:48:53.913926 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:48:53.913930 | orchestrator | skipping: [testbed-node-3] => (item=openvswitch)
2026-04-20 00:48:53.913933 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:48:53.913937 | orchestrator | skipping: [testbed-node-4] => (item=openvswitch)
2026-04-20 00:48:53.913941 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:48:53.913944 | orchestrator | skipping: [testbed-node-5] => (item=openvswitch)
2026-04-20 00:48:53.913949 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:48:53.913952 | orchestrator |
2026-04-20 00:48:53.913956 | orchestrator | TASK [openvswitch : Create /run/openvswitch directory on host] *****************
2026-04-20 00:48:53.913960 | orchestrator | Monday 20 April 2026 00:48:31 +0000 (0:00:01.582) 0:00:09.485 **********
2026-04-20 00:48:53.913963 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:48:53.913967 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:48:53.913971 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:48:53.913974 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:48:53.913978 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:48:53.913982 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:48:53.913986 | orchestrator |
2026-04-20 00:48:53.913989 | orchestrator | TASK [openvswitch : Ensuring config directories exist] *************************
2026-04-20 00:48:53.913993 | orchestrator | Monday 20 April 2026 00:48:32 +0000 (0:00:00.659) 0:00:10.144 **********
2026-04-20
00:48:53.914009 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914046 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914055 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914064 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914068 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914078 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 
'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914083 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914087 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914094 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914106 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914110 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': 
['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914120 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914124 | orchestrator | 2026-04-20 00:48:53.914128 | orchestrator | TASK [openvswitch : Copying over config.json files for services] *************** 2026-04-20 00:48:53.914132 | orchestrator | Monday 20 April 2026 00:48:34 +0000 (0:00:01.853) 0:00:11.998 ********** 2026-04-20 00:48:53.914136 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 
'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914143 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914152 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914159 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 
'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914169 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914176 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914187 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914205 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914213 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': 
['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914219 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914233 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914240 | orchestrator | changed: [testbed-node-4] => (item={'key': 
'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914246 | orchestrator | 2026-04-20 00:48:53.914253 | orchestrator | TASK [openvswitch : Copying over ovs-vsctl wrapper] **************************** 2026-04-20 00:48:53.914260 | orchestrator | Monday 20 April 2026 00:48:37 +0000 (0:00:03.101) 0:00:15.099 ********** 2026-04-20 00:48:53.914270 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:48:53.914277 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:48:53.914283 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:48:53.914290 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:48:53.914297 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:48:53.914303 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:48:53.914309 | orchestrator | 2026-04-20 00:48:53.914316 | orchestrator | TASK [service-check-containers : openvswitch | Check containers] *************** 2026-04-20 00:48:53.914323 | orchestrator | Monday 20 April 2026 00:48:38 +0000 (0:00:00.874) 0:00:15.974 ********** 2026-04-20 00:48:53.914335 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': 
['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914342 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914348 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-20 
00:48:53.914362 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914369 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914383 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914811 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914844 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914853 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 
'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914875 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914887 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914891 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2026-04-20 00:48:53.914895 | orchestrator | 2026-04-20 00:48:53.914900 | orchestrator | TASK [service-check-containers : openvswitch | Notify handlers to restart containers] *** 2026-04-20 00:48:53.914904 | orchestrator | Monday 20 April 2026 00:48:41 +0000 (0:00:02.951) 0:00:18.926 ********** 2026-04-20 00:48:53.914908 | orchestrator | changed: [testbed-node-0] => { 2026-04-20 00:48:53.914913 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:48:53.914916 | orchestrator | } 2026-04-20 00:48:53.914920 | orchestrator | changed: [testbed-node-1] => { 2026-04-20 00:48:53.914924 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:48:53.914928 | orchestrator | } 2026-04-20 00:48:53.914932 | orchestrator | changed: [testbed-node-2] => { 2026-04-20 00:48:53.914935 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:48:53.914939 | orchestrator | } 2026-04-20 00:48:53.914943 | orchestrator | changed: [testbed-node-3] => { 2026-04-20 00:48:53.914946 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:48:53.914950 | orchestrator | } 2026-04-20 00:48:53.914954 | orchestrator | changed: [testbed-node-4] => { 2026-04-20 
00:48:53.914957 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:48:53.914961 | orchestrator | } 2026-04-20 00:48:53.914965 | orchestrator | changed: [testbed-node-5] => { 2026-04-20 00:48:53.914969 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:48:53.914972 | orchestrator | } 2026-04-20 00:48:53.914976 | orchestrator | 2026-04-20 00:48:53.914980 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-20 00:48:53.914984 | orchestrator | Monday 20 April 2026 00:48:43 +0000 (0:00:01.940) 0:00:20.866 ********** 2026-04-20 00:48:53.914991 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})  2026-04-20 00:48:53.915004 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': 
'30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})  2026-04-20 00:48:53.915013 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})  2026-04-20 00:48:53.915017 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})  2026-04-20 00:48:53.915021 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:48:53.915025 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:48:53.915029 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 
'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})  2026-04-20 00:48:53.915039 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})  2026-04-20 00:48:53.915043 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:48:53.915047 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 
'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})  2026-04-20 00:48:53.915062 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})  2026-04-20 00:48:53.915069 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:48:53.915075 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})  2026-04-20 00:48:53.915081 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 
'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})  2026-04-20 00:48:53.915087 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:48:53.915093 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-db-server:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})  2026-04-20 00:48:53.915103 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/openvswitch-vswitchd:3.5.1.20260328', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})  2026-04-20 00:48:53.915114 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:48:53.915120 | orchestrator | 2026-04-20 00:48:53.915126 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2026-04-20 00:48:53.915132 | orchestrator | Monday 20 April 2026 00:48:46 +0000 (0:00:03.403) 0:00:24.270 ********** 2026-04-20 00:48:53.915138 | orchestrator | 2026-04-20 00:48:53.915144 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2026-04-20 00:48:53.915150 | orchestrator | Monday 20 April 2026 00:48:46 +0000 (0:00:00.418) 0:00:24.688 ********** 2026-04-20 00:48:53.915156 | orchestrator | 2026-04-20 00:48:53.915167 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2026-04-20 00:48:53.915174 | orchestrator | Monday 20 April 2026 00:48:47 +0000 (0:00:00.308) 0:00:24.996 ********** 2026-04-20 00:48:53.915180 | orchestrator | 2026-04-20 00:48:53.915186 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2026-04-20 00:48:53.915192 | orchestrator | Monday 20 April 2026 00:48:47 +0000 (0:00:00.305) 0:00:25.302 ********** 2026-04-20 00:48:53.915198 | orchestrator | 2026-04-20 00:48:53.915205 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2026-04-20 00:48:53.915211 | orchestrator | Monday 20 April 2026 00:48:47 +0000 (0:00:00.379) 0:00:25.682 ********** 2026-04-20 00:48:53.915217 | orchestrator | 2026-04-20 00:48:53.915222 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2026-04-20 00:48:53.915228 | orchestrator | Monday 20 April 2026 00:48:48 +0000 (0:00:00.341) 0:00:26.024 ********** 2026-04-20 00:48:53.915234 
| orchestrator | 2026-04-20 00:48:53.915240 | orchestrator | RUNNING HANDLER [openvswitch : Restart openvswitch-db-server container] ******** 2026-04-20 00:48:53.915244 | orchestrator | Monday 20 April 2026 00:48:48 +0000 (0:00:00.166) 0:00:26.190 ********** 2026-04-20 00:48:53.915255 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fopenvswitch-db-server\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_0ixzo1i1/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_0ixzo1i1/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_0ixzo1i1/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_0ixzo1i1/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in 
pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fopenvswitch-db-server: Internal Server Error (\"unknown: repository kolla/release/2024.2/openvswitch-db-server not found\")\\n'"} 2026-04-20 00:48:53.915277 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fopenvswitch-db-server\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_w6yqff30/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_w6yqff30/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File 
\"/tmp/ansible_kolla_container_payload_w6yqff30/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_w6yqff30/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fopenvswitch-db-server: Internal Server Error (\"unknown: repository kolla/release/2024.2/openvswitch-db-server not found\")\\n'"} 2026-04-20 00:48:53.915296 | orchestrator | fatal: [testbed-node-1]: FAILED! 
=> {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fopenvswitch-db-server\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_5k6b0xyt/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_5k6b0xyt/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_5k6b0xyt/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_5k6b0xyt/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in 
create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fopenvswitch-db-server: Internal Server Error (\"unknown: repository kolla/release/2024.2/openvswitch-db-server not found\")\\n'"} 2026-04-20 00:48:53.915436 | orchestrator | fatal: [testbed-node-5]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fopenvswitch-db-server\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_cxl4int7/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_cxl4int7/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_cxl4int7/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_cxl4int7/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n 
json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fopenvswitch-db-server: Internal Server Error (\"unknown: repository kolla/release/2024.2/openvswitch-db-server not found\")\\n'"} 2026-04-20 00:48:53.915466 | orchestrator | fatal: [testbed-node-3]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fopenvswitch-db-server\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_w8s3d1ky/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File 
\"/tmp/ansible_kolla_container_payload_w8s3d1ky/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_w8s3d1ky/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_w8s3d1ky/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fopenvswitch-db-server: Internal Server Error (\"unknown: repository kolla/release/2024.2/openvswitch-db-server not found\")\\n'"} 2026-04-20 00:48:53.915479 | orchestrator | fatal: [testbed-node-4]: FAILED! 
=> {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fopenvswitch-db-server\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_p0se3kk6/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_p0se3kk6/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_p0se3kk6/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_p0se3kk6/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in 
create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=3.5.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fopenvswitch-db-server: Internal Server Error (\"unknown: repository kolla/release/2024.2/openvswitch-db-server not found\")\\n'"}
2026-04-20 00:48:53.915493 | orchestrator |
2026-04-20 00:48:53.915499 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:48:53.915506 | orchestrator | testbed-node-0 : ok=9  changed=6  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0
2026-04-20 00:48:53.915519 | orchestrator | testbed-node-1 : ok=9  changed=6  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0
2026-04-20 00:48:53.915525 | orchestrator | testbed-node-2 : ok=9  changed=6  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0
2026-04-20 00:48:53.915531 | orchestrator | testbed-node-3 : ok=9  changed=6  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0
2026-04-20 00:48:53.915537 | orchestrator | testbed-node-4 : ok=9  changed=6  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0
2026-04-20 00:48:53.915592 | orchestrator | testbed-node-5 : ok=9  changed=6  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0
2026-04-20 00:48:53.915600 | orchestrator |
2026-04-20 00:48:53.915606 | orchestrator |
2026-04-20 00:48:53.915612 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:48:53.915618 | orchestrator | Monday 20 April 2026 00:48:51 +0000 (0:00:03.425) 0:00:29.616 **********
2026-04-20 00:48:53.915625 | orchestrator | ===============================================================================
2026-04-20 00:48:53.915629 | orchestrator | openvswitch : Restart openvswitch-db-server container ------------------- 3.43s
2026-04-20 00:48:53.915642 | orchestrator | service-check-containers : Include tasks -------------------------------- 3.40s
2026-04-20 00:48:53.915648 | orchestrator | openvswitch : Copying over config.json files for services --------------- 3.10s
2026-04-20 00:48:53.915653 | orchestrator | service-check-containers : openvswitch | Check containers --------------- 2.95s
2026-04-20 00:48:53.915660 | orchestrator | module-load : Persist modules via modules-load.d ------------------------ 2.54s
2026-04-20 00:48:53.915665 | orchestrator | module-load : Load modules ---------------------------------------------- 2.20s
2026-04-20 00:48:53.915671 | orchestrator | service-check-containers : openvswitch | Notify handlers to restart containers --- 1.94s
2026-04-20 00:48:53.915677 | orchestrator | openvswitch : Flush Handlers -------------------------------------------- 1.92s
2026-04-20 00:48:53.915683 | orchestrator | openvswitch : Ensuring config directories exist ------------------------- 1.85s
2026-04-20 00:48:53.915689 | orchestrator | module-load : Drop module persistence ----------------------------------- 1.58s
2026-04-20 00:48:53.915695 | orchestrator | openvswitch : include_tasks --------------------------------------------- 1.12s
2026-04-20 00:48:53.915712 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.09s
2026-04-20 00:48:53.915719 | orchestrator | openvswitch : Copying over ovs-vsctl wrapper ---------------------------- 0.87s
2026-04-20 00:48:53.915725 | orchestrator | openvswitch : Create /run/openvswitch directory on host ----------------- 0.66s
2026-04-20 00:48:53.915732 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.62s
2026-04-20 00:48:53.915736 | orchestrator | 2026-04-20 00:48:53 | INFO  | Task a695f2f1-8705-4370-9599-0d8ac79a6b08 is in state STARTED
2026-04-20 00:48:53.915741 | orchestrator | 2026-04-20 00:48:53 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:48:53.917147
| orchestrator | 2026-04-20 00:48:53 | INFO  | Task 3ee1df91-9689-476b-b528-532e6eba695e is in state STARTED
2026-04-20 00:48:53.918670 | orchestrator | 2026-04-20 00:48:53 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:48:53.918708 | orchestrator | 2026-04-20 00:48:53 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:48:56.954714 | orchestrator | 2026-04-20 00:48:56 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED
2026-04-20 00:48:56.954789 | orchestrator | 2026-04-20 00:48:56 | INFO  | Task a695f2f1-8705-4370-9599-0d8ac79a6b08 is in state STARTED
2026-04-20 00:48:56.955331 | orchestrator | 2026-04-20 00:48:56 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:48:56.956195 | orchestrator | 2026-04-20 00:48:56 | INFO  | Task 3ee1df91-9689-476b-b528-532e6eba695e is in state STARTED
2026-04-20 00:48:56.957103 | orchestrator | 2026-04-20 00:48:56 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:48:56.957368 | orchestrator | 2026-04-20 00:48:56 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:48:59.985952 | orchestrator | 2026-04-20 00:48:59 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED
2026-04-20 00:48:59.994357 | orchestrator | 2026-04-20 00:48:59 | INFO  | Task a695f2f1-8705-4370-9599-0d8ac79a6b08 is in state STARTED
2026-04-20 00:48:59.994446 | orchestrator | 2026-04-20 00:48:59 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:48:59.994458 | orchestrator | 2026-04-20 00:48:59 | INFO  | Task 3ee1df91-9689-476b-b528-532e6eba695e is in state STARTED
2026-04-20 00:48:59.994465 | orchestrator | 2026-04-20 00:48:59 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:48:59.994473 | orchestrator | 2026-04-20 00:48:59 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:49:03.022294 | orchestrator | 2026-04-20 00:49:03 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED
2026-04-20 00:49:03.023090 | orchestrator | 2026-04-20 00:49:03 | INFO  | Task a695f2f1-8705-4370-9599-0d8ac79a6b08 is in state STARTED
2026-04-20 00:49:03.023931 | orchestrator | 2026-04-20 00:49:03 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:49:03.024826 | orchestrator | 2026-04-20 00:49:03 | INFO  | Task 3ee1df91-9689-476b-b528-532e6eba695e is in state STARTED
2026-04-20 00:49:03.025822 | orchestrator | 2026-04-20 00:49:03 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:49:03.025874 | orchestrator | 2026-04-20 00:49:03 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:49:06.076482 | orchestrator | 2026-04-20 00:49:06 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED
2026-04-20 00:49:06.077163 | orchestrator | 2026-04-20 00:49:06 | INFO  | Task a695f2f1-8705-4370-9599-0d8ac79a6b08 is in state STARTED
2026-04-20 00:49:06.078264 | orchestrator | 2026-04-20 00:49:06 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:49:06.079348 | orchestrator | 2026-04-20 00:49:06 | INFO  | Task 3ee1df91-9689-476b-b528-532e6eba695e is in state STARTED
2026-04-20 00:49:06.080427 | orchestrator | 2026-04-20 00:49:06 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:49:06.080895 | orchestrator | 2026-04-20 00:49:06 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:49:09.105124 | orchestrator | 2026-04-20 00:49:09 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED
2026-04-20 00:49:09.106085 | orchestrator | 2026-04-20 00:49:09 | INFO  | Task a695f2f1-8705-4370-9599-0d8ac79a6b08 is in state STARTED
2026-04-20 00:49:09.107114 | orchestrator | 2026-04-20 00:49:09 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:49:09.108515 | orchestrator | 2026-04-20 00:49:09 | INFO  | Task 3ee1df91-9689-476b-b528-532e6eba695e is in state STARTED
2026-04-20 00:49:09.109731 | orchestrator | 2026-04-20 00:49:09 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:49:09.109775 | orchestrator | 2026-04-20 00:49:09 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:49:12.139271 | orchestrator | 2026-04-20 00:49:12 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED
2026-04-20 00:49:12.142142 | orchestrator |
2026-04-20 00:49:12.142219 | orchestrator |
2026-04-20 00:49:12.142226 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-20 00:49:12.142232 | orchestrator |
2026-04-20 00:49:12.142236 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-20 00:49:12.142241 | orchestrator | Monday 20 April 2026 00:48:55 +0000 (0:00:00.203) 0:00:00.203 **********
2026-04-20 00:49:12.142245 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:49:12.142250 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:49:12.142254 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:49:12.142258 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:49:12.142262 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:49:12.142265 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:49:12.142269 | orchestrator |
2026-04-20 00:49:12.142273 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-20 00:49:12.142277 | orchestrator | Monday 20 April 2026 00:48:56 +0000 (0:00:00.899) 0:00:00.903 **********
2026-04-20 00:49:12.142281 | orchestrator | ok: [testbed-node-0] => (item=enable_ovn_True)
2026-04-20 00:49:12.142286 | orchestrator | ok: [testbed-node-1] => (item=enable_ovn_True)
2026-04-20 00:49:12.142289 | orchestrator | ok: [testbed-node-2] => (item=enable_ovn_True)
2026-04-20 00:49:12.142310 | orchestrator | ok: [testbed-node-3] =>
(item=enable_ovn_True) 2026-04-20 00:49:12.142316 | orchestrator | ok: [testbed-node-4] => (item=enable_ovn_True) 2026-04-20 00:49:12.142322 | orchestrator | ok: [testbed-node-5] => (item=enable_ovn_True) 2026-04-20 00:49:12.142328 | orchestrator | 2026-04-20 00:49:12.142334 | orchestrator | PLAY [Apply role ovn-controller] *********************************************** 2026-04-20 00:49:12.142341 | orchestrator | 2026-04-20 00:49:12.142347 | orchestrator | TASK [ovn-controller : include_tasks] ****************************************** 2026-04-20 00:49:12.142354 | orchestrator | Monday 20 April 2026 00:48:56 +0000 (0:00:00.899) 0:00:01.802 ********** 2026-04-20 00:49:12.142362 | orchestrator | included: /ansible/roles/ovn-controller/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:49:12.142370 | orchestrator | 2026-04-20 00:49:12.142376 | orchestrator | TASK [ovn-controller : Ensuring config directories exist] ********************** 2026-04-20 00:49:12.142384 | orchestrator | Monday 20 April 2026 00:48:58 +0000 (0:00:01.359) 0:00:03.162 ********** 2026-04-20 00:49:12.142391 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142415 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', 
'/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142419 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142511 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142516 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142534 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', 
'/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142539 | orchestrator | 2026-04-20 00:49:12.142543 | orchestrator | TASK [ovn-controller : Copying over config.json files for services] ************ 2026-04-20 00:49:12.142571 | orchestrator | Monday 20 April 2026 00:49:00 +0000 (0:00:01.740) 0:00:04.902 ********** 2026-04-20 00:49:12.142578 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142590 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142597 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142611 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 
'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142617 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142623 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142629 | orchestrator | 2026-04-20 00:49:12.142635 | orchestrator | TASK [ovn-controller : Ensuring systemd override directory exists] ************* 2026-04-20 00:49:12.142641 | orchestrator | Monday 20 April 2026 00:49:01 +0000 (0:00:01.522) 0:00:06.425 ********** 2026-04-20 00:49:12.142647 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', 
'/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142654 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142668 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142678 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142684 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142704 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142709 | orchestrator | 2026-04-20 00:49:12.142713 | orchestrator | TASK [ovn-controller : Copying over systemd override] ************************** 2026-04-20 00:49:12.142717 | orchestrator | Monday 20 April 2026 00:49:02 +0000 (0:00:01.329) 0:00:07.754 ********** 2026-04-20 00:49:12.142721 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142725 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142729 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142733 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142740 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142744 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142750 | orchestrator | 2026-04-20 00:49:12.142756 | orchestrator | TASK [service-check-containers : ovn_controller | Check containers] ************ 2026-04-20 00:49:12.142765 | orchestrator | Monday 20 April 2026 00:49:04 +0000 (0:00:01.541) 
0:00:09.296 ********** 2026-04-20 00:49:12.142779 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142785 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142792 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142798 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142804 | 
orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142810 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 00:49:12.142816 | orchestrator | 2026-04-20 00:49:12.142822 | orchestrator | TASK [service-check-containers : ovn_controller | Notify handlers to restart containers] *** 2026-04-20 00:49:12.142829 | orchestrator | Monday 20 April 2026 00:49:05 +0000 (0:00:01.414) 0:00:10.710 ********** 2026-04-20 00:49:12.142835 | orchestrator | changed: [testbed-node-0] => { 2026-04-20 00:49:12.142842 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:49:12.142849 | orchestrator | } 2026-04-20 00:49:12.142856 | orchestrator | changed: [testbed-node-1] => { 2026-04-20 00:49:12.142862 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:49:12.142868 | orchestrator | } 2026-04-20 00:49:12.142929 | orchestrator | changed: [testbed-node-2] => { 2026-04-20 00:49:12.142942 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:49:12.142945 | orchestrator | } 2026-04-20 00:49:12.142949 | orchestrator | changed: [testbed-node-3] => { 2026-04-20 00:49:12.142953 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:49:12.142957 | 
orchestrator | } 2026-04-20 00:49:12.142961 | orchestrator | changed: [testbed-node-4] => { 2026-04-20 00:49:12.142965 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:49:12.142977 | orchestrator | } 2026-04-20 00:49:12.142990 | orchestrator | changed: [testbed-node-5] => { 2026-04-20 00:49:12.142996 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:49:12.143002 | orchestrator | } 2026-04-20 00:49:12.143008 | orchestrator | 2026-04-20 00:49:12.143014 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-20 00:49:12.143021 | orchestrator | Monday 20 April 2026 00:49:06 +0000 (0:00:00.776) 0:00:11.487 ********** 2026-04-20 00:49:12.143028 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:49:12.143034 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:49:12.143045 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:49:12.143052 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:49:12.143056 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:49:12.143060 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:49:12.143064 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:49:12.143068 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:49:12.143072 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:49:12.143076 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:49:12.143080 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/ovn-controller:25.3.1.20260328', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:49:12.143084 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:49:12.143088 | orchestrator |
2026-04-20 00:49:12.143091 | orchestrator | TASK [ovn-controller : Create br-int bridge on OpenvSwitch] ********************
2026-04-20 00:49:12.143095 | orchestrator | Monday 20 April 2026 00:49:08 +0000 (0:00:01.393) 0:00:12.881 **********
2026-04-20 00:49:12.143099 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is missing or not running!"}
2026-04-20 00:49:12.143108 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is missing or not running!"}
2026-04-20 00:49:12.143113 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is missing or not running!"}
2026-04-20 00:49:12.143119 | orchestrator | fatal: [testbed-node-3]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is missing or not running!"}
2026-04-20 00:49:12.143124 | orchestrator | fatal: [testbed-node-4]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is missing or not running!"}
2026-04-20 00:49:12.143133 | orchestrator | fatal: [testbed-node-5]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is missing or not running!"}
2026-04-20 00:49:12.143139 | orchestrator |
2026-04-20 00:49:12.143145 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:49:12.143152 | orchestrator | testbed-node-0 : ok=9  changed=6  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0
2026-04-20 00:49:12.143161 | orchestrator | testbed-node-1 : ok=9  changed=6  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0
2026-04-20 00:49:12.143167 | orchestrator | testbed-node-2 : ok=9  changed=6  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0
2026-04-20 00:49:12.143177 | orchestrator | testbed-node-3 : ok=9  changed=6  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0
2026-04-20 00:49:12.143183 | orchestrator | testbed-node-4 : ok=9  changed=6  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0
2026-04-20 00:49:12.143190 | orchestrator | testbed-node-5 : ok=9  changed=6  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0
2026-04-20 00:49:12.143197 | orchestrator |
2026-04-20 00:49:12.143203 | orchestrator |
2026-04-20 00:49:12.143210 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:49:12.143217 | orchestrator | Monday 20 April 2026 00:49:09 +0000 (0:00:01.319) 0:00:14.200 **********
2026-04-20 00:49:12.143224 | orchestrator | ===============================================================================
2026-04-20 00:49:12.143231 | orchestrator | ovn-controller : Ensuring config directories exist ---------------------- 1.74s
2026-04-20 00:49:12.143238 | orchestrator | ovn-controller : Copying over systemd override -------------------------- 1.54s
2026-04-20 00:49:12.143243 | orchestrator | ovn-controller : Copying over config.json files for services ------------ 1.52s
2026-04-20 00:49:12.143247 | orchestrator | service-check-containers : ovn_controller | Check containers ------------ 1.41s
2026-04-20 00:49:12.143252 | orchestrator | service-check-containers : Include tasks -------------------------------- 1.39s 2026-04-20 00:49:12.143257 | orchestrator | ovn-controller : include_tasks ------------------------------------------ 1.36s 2026-04-20 00:49:12.143263 | orchestrator | ovn-controller : Ensuring systemd override directory exists ------------- 1.33s 2026-04-20 00:49:12.143270 | orchestrator | ovn-controller : Create br-int bridge on OpenvSwitch -------------------- 1.32s 2026-04-20 00:49:12.143277 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.90s 2026-04-20 00:49:12.143283 | orchestrator | service-check-containers : ovn_controller | Notify handlers to restart containers --- 0.78s 2026-04-20 00:49:12.143291 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.70s 2026-04-20 00:49:12.143298 | orchestrator | 2026-04-20 00:49:12 | INFO  | Task a695f2f1-8705-4370-9599-0d8ac79a6b08 is in state SUCCESS 2026-04-20 00:49:12.143304 | orchestrator | 2026-04-20 00:49:12 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:49:12.144225 | orchestrator | 2026-04-20 00:49:12 | INFO  | Task 3ee1df91-9689-476b-b528-532e6eba695e is in state STARTED 2026-04-20 00:49:12.145399 | orchestrator | 2026-04-20 00:49:12 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:49:12.145446 | orchestrator | 2026-04-20 00:49:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:49:15.178064 | orchestrator | 2026-04-20 00:49:15 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:49:15.178724 | orchestrator | 2026-04-20 00:49:15 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:49:15.180684 | orchestrator | 2026-04-20 00:49:15 | INFO  | Task 3ee1df91-9689-476b-b528-532e6eba695e is in state SUCCESS 2026-04-20 00:49:15.182113 | orchestrator | 2026-04-20 
00:49:15.182147 | orchestrator | 2026-04-20 00:49:15.182155 | orchestrator | PLAY [Set kolla_action_rabbitmq] *********************************************** 2026-04-20 00:49:15.182163 | orchestrator | 2026-04-20 00:49:15.182169 | orchestrator | TASK [Inform the user about the following task] ******************************** 2026-04-20 00:49:15.182176 | orchestrator | Monday 20 April 2026 00:48:37 +0000 (0:00:00.094) 0:00:00.094 ********** 2026-04-20 00:49:15.182183 | orchestrator | ok: [localhost] => { 2026-04-20 00:49:15.182191 | orchestrator |  "msg": "The task 'Check RabbitMQ service' fails if the RabbitMQ service has not yet been deployed. This is fine." 2026-04-20 00:49:15.182198 | orchestrator | } 2026-04-20 00:49:15.182205 | orchestrator | 2026-04-20 00:49:15.182212 | orchestrator | TASK [Check RabbitMQ service] ************************************************** 2026-04-20 00:49:15.182218 | orchestrator | Monday 20 April 2026 00:48:37 +0000 (0:00:00.043) 0:00:00.138 ********** 2026-04-20 00:49:15.182225 | orchestrator | fatal: [localhost]: FAILED! 
=> {"changed": false, "elapsed": 2, "msg": "Timeout when waiting for search string RabbitMQ Management in 192.168.16.9:15672"} 2026-04-20 00:49:15.182232 | orchestrator | ...ignoring 2026-04-20 00:49:15.182239 | orchestrator | 2026-04-20 00:49:15.182246 | orchestrator | TASK [Set kolla_action_rabbitmq = upgrade if RabbitMQ is already running] ****** 2026-04-20 00:49:15.182252 | orchestrator | Monday 20 April 2026 00:48:40 +0000 (0:00:03.217) 0:00:03.355 ********** 2026-04-20 00:49:15.182258 | orchestrator | skipping: [localhost] 2026-04-20 00:49:15.182264 | orchestrator | 2026-04-20 00:49:15.182271 | orchestrator | TASK [Set kolla_action_rabbitmq = kolla_action_ng] ***************************** 2026-04-20 00:49:15.182277 | orchestrator | Monday 20 April 2026 00:48:40 +0000 (0:00:00.219) 0:00:03.574 ********** 2026-04-20 00:49:15.182283 | orchestrator | ok: [localhost] 2026-04-20 00:49:15.182289 | orchestrator | 2026-04-20 00:49:15.182296 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-04-20 00:49:15.182302 | orchestrator | 2026-04-20 00:49:15.182308 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-04-20 00:49:15.182315 | orchestrator | Monday 20 April 2026 00:48:41 +0000 (0:00:00.769) 0:00:04.343 ********** 2026-04-20 00:49:15.182321 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:49:15.182327 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:49:15.182334 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:49:15.182340 | orchestrator | 2026-04-20 00:49:15.182346 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-04-20 00:49:15.182362 | orchestrator | Monday 20 April 2026 00:48:41 +0000 (0:00:00.509) 0:00:04.853 ********** 2026-04-20 00:49:15.182369 | orchestrator | ok: [testbed-node-0] => (item=enable_rabbitmq_True) 2026-04-20 00:49:15.182376 | orchestrator | ok: [testbed-node-1] => 
(item=enable_rabbitmq_True) 2026-04-20 00:49:15.182383 | orchestrator | ok: [testbed-node-2] => (item=enable_rabbitmq_True) 2026-04-20 00:49:15.182389 | orchestrator | 2026-04-20 00:49:15.182396 | orchestrator | PLAY [Apply role rabbitmq] ***************************************************** 2026-04-20 00:49:15.182402 | orchestrator | 2026-04-20 00:49:15.182408 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************ 2026-04-20 00:49:15.182414 | orchestrator | Monday 20 April 2026 00:48:43 +0000 (0:00:01.919) 0:00:06.772 ********** 2026-04-20 00:49:15.182432 | orchestrator | included: /ansible/roles/rabbitmq/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:49:15.182440 | orchestrator | 2026-04-20 00:49:15.182446 | orchestrator | TASK [rabbitmq : Get container facts] ****************************************** 2026-04-20 00:49:15.182453 | orchestrator | Monday 20 April 2026 00:48:46 +0000 (0:00:02.468) 0:00:09.241 ********** 2026-04-20 00:49:15.182459 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:49:15.182466 | orchestrator | 2026-04-20 00:49:15.182472 | orchestrator | TASK [rabbitmq : Get current RabbitMQ version] ********************************* 2026-04-20 00:49:15.182478 | orchestrator | Monday 20 April 2026 00:48:48 +0000 (0:00:01.731) 0:00:10.972 ********** 2026-04-20 00:49:15.182485 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:49:15.182492 | orchestrator | 2026-04-20 00:49:15.182498 | orchestrator | TASK [rabbitmq : Get new RabbitMQ version] ************************************* 2026-04-20 00:49:15.182505 | orchestrator | Monday 20 April 2026 00:48:48 +0000 (0:00:00.479) 0:00:11.452 ********** 2026-04-20 00:49:15.182511 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:49:15.182518 | orchestrator | 2026-04-20 00:49:15.182524 | orchestrator | TASK [rabbitmq : Check if running RabbitMQ is at most one version behind] ****** 2026-04-20 00:49:15.182530 | 
orchestrator | Monday 20 April 2026 00:48:49 +0000 (0:00:00.482) 0:00:11.934 ********** 2026-04-20 00:49:15.182536 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:49:15.182543 | orchestrator | 2026-04-20 00:49:15.182612 | orchestrator | TASK [rabbitmq : Catch when RabbitMQ is being downgraded] ********************** 2026-04-20 00:49:15.182621 | orchestrator | Monday 20 April 2026 00:48:50 +0000 (0:00:01.333) 0:00:13.268 ********** 2026-04-20 00:49:15.182628 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:49:15.182635 | orchestrator | 2026-04-20 00:49:15.182642 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************ 2026-04-20 00:49:15.182649 | orchestrator | Monday 20 April 2026 00:48:50 +0000 (0:00:00.419) 0:00:13.687 ********** 2026-04-20 00:49:15.182656 | orchestrator | included: /ansible/roles/rabbitmq/tasks/remove-ha-all-policy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:49:15.182663 | orchestrator | 2026-04-20 00:49:15.182670 | orchestrator | TASK [rabbitmq : Get container facts] ****************************************** 2026-04-20 00:49:15.182676 | orchestrator | Monday 20 April 2026 00:48:51 +0000 (0:00:00.704) 0:00:14.392 ********** 2026-04-20 00:49:15.182682 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:49:15.182689 | orchestrator | 2026-04-20 00:49:15.182696 | orchestrator | TASK [rabbitmq : List RabbitMQ policies] *************************************** 2026-04-20 00:49:15.182703 | orchestrator | Monday 20 April 2026 00:48:52 +0000 (0:00:00.729) 0:00:15.121 ********** 2026-04-20 00:49:15.182709 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:49:15.182716 | orchestrator | 2026-04-20 00:49:15.182723 | orchestrator | TASK [rabbitmq : Remove ha-all policy from RabbitMQ] *************************** 2026-04-20 00:49:15.182730 | orchestrator | Monday 20 April 2026 00:48:52 +0000 (0:00:00.479) 0:00:15.601 ********** 2026-04-20 00:49:15.182737 | orchestrator | 
skipping: [testbed-node-0] 2026-04-20 00:49:15.182744 | orchestrator | 2026-04-20 00:49:15.182757 | orchestrator | TASK [rabbitmq : Ensuring config directories exist] **************************** 2026-04-20 00:49:15.182765 | orchestrator | Monday 20 April 2026 00:48:52 +0000 (0:00:00.274) 0:00:15.875 ********** 2026-04-20 00:49:15.182775 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-20 00:49:15.182795 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 
'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-20 00:49:15.182805 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-20 00:49:15.182812 | orchestrator | 2026-04-20 00:49:15.182819 | orchestrator | TASK [rabbitmq : Copying over config.json files for services] ****************** 2026-04-20 00:49:15.182826 | orchestrator | Monday 20 April 2026 00:48:54 +0000 (0:00:01.118) 0:00:16.994 ********** 2026-04-20 00:49:15.182840 | 
orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-20 00:49:15.182848 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-20 00:49:15.182862 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-20 00:49:15.182871 | orchestrator | 2026-04-20 00:49:15.182878 | orchestrator | TASK [rabbitmq : Copying over rabbitmq-env.conf] ******************************* 2026-04-20 00:49:15.182884 | orchestrator | Monday 20 April 2026 00:48:55 +0000 (0:00:01.368) 0:00:18.362 ********** 2026-04-20 00:49:15.182891 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2) 2026-04-20 00:49:15.182899 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2) 2026-04-20 00:49:15.182906 | orchestrator | changed: [testbed-node-2] => 
(item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2) 2026-04-20 00:49:15.182912 | orchestrator | 2026-04-20 00:49:15.182919 | orchestrator | TASK [rabbitmq : Copying over rabbitmq.conf] *********************************** 2026-04-20 00:49:15.182926 | orchestrator | Monday 20 April 2026 00:48:56 +0000 (0:00:01.436) 0:00:19.799 ********** 2026-04-20 00:49:15.182933 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2) 2026-04-20 00:49:15.182939 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2) 2026-04-20 00:49:15.182946 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2) 2026-04-20 00:49:15.182953 | orchestrator | 2026-04-20 00:49:15.182960 | orchestrator | TASK [rabbitmq : Copying over erl_inetrc] ************************************** 2026-04-20 00:49:15.182966 | orchestrator | Monday 20 April 2026 00:48:59 +0000 (0:00:02.467) 0:00:22.267 ********** 2026-04-20 00:49:15.182973 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2) 2026-04-20 00:49:15.182980 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2) 2026-04-20 00:49:15.182986 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2) 2026-04-20 00:49:15.182993 | orchestrator | 2026-04-20 00:49:15.183003 | orchestrator | TASK [rabbitmq : Copying over advanced.config] ********************************* 2026-04-20 00:49:15.183014 | orchestrator | Monday 20 April 2026 00:49:00 +0000 (0:00:01.276) 0:00:23.543 ********** 2026-04-20 00:49:15.183021 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2) 2026-04-20 00:49:15.183027 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2) 2026-04-20 00:49:15.183034 | 
orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2) 2026-04-20 00:49:15.183041 | orchestrator | 2026-04-20 00:49:15.183047 | orchestrator | TASK [rabbitmq : Copying over definitions.json] ******************************** 2026-04-20 00:49:15.183054 | orchestrator | Monday 20 April 2026 00:49:02 +0000 (0:00:01.478) 0:00:25.021 ********** 2026-04-20 00:49:15.183061 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2) 2026-04-20 00:49:15.183068 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2) 2026-04-20 00:49:15.183074 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2) 2026-04-20 00:49:15.183081 | orchestrator | 2026-04-20 00:49:15.183088 | orchestrator | TASK [rabbitmq : Copying over enabled_plugins] ********************************* 2026-04-20 00:49:15.183095 | orchestrator | Monday 20 April 2026 00:49:03 +0000 (0:00:01.419) 0:00:26.441 ********** 2026-04-20 00:49:15.183101 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2) 2026-04-20 00:49:15.183108 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2) 2026-04-20 00:49:15.183115 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2) 2026-04-20 00:49:15.183122 | orchestrator | 2026-04-20 00:49:15.183129 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************ 2026-04-20 00:49:15.183136 | orchestrator | Monday 20 April 2026 00:49:05 +0000 (0:00:01.490) 0:00:27.931 ********** 2026-04-20 00:49:15.183143 | orchestrator | included: /ansible/roles/rabbitmq/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:49:15.183150 | orchestrator | 2026-04-20 00:49:15.183159 | orchestrator 
| TASK [service-cert-copy : rabbitmq | Copying over extra CA certificates] ******* 2026-04-20 00:49:15.183166 | orchestrator | Monday 20 April 2026 00:49:05 +0000 (0:00:00.704) 0:00:28.636 ********** 2026-04-20 00:49:15.183173 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-20 00:49:15.183180 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': 
['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-20 00:49:15.183196 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-20 00:49:15.183204 | orchestrator | 2026-04-20 00:49:15.183211 | orchestrator | TASK [service-cert-copy : rabbitmq | Copying over backend internal TLS certificate] *** 2026-04-20 00:49:15.183217 | orchestrator | Monday 20 April 2026 00:49:06 +0000 (0:00:01.165) 0:00:29.801 ********** 2026-04-20 00:49:15.183227 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 
'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-20 00:49:15.183235 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:49:15.183242 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-20 00:49:15.183253 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:49:15.183265 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-20 00:49:15.183272 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:49:15.183279 | orchestrator | 2026-04-20 00:49:15.183286 | orchestrator | TASK [service-cert-copy : rabbitmq | Copying over backend internal TLS key] **** 2026-04-20 00:49:15.183293 | orchestrator | Monday 20 April 2026 00:49:07 +0000 (0:00:00.378) 0:00:30.180 ********** 2026-04-20 00:49:15.183300 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': 
{'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-20 00:49:15.183311 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-20 00:49:15.183319 | orchestrator | skipping: 
[testbed-node-0] 2026-04-20 00:49:15.183326 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:49:15.183332 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-20 00:49:15.183344 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:49:15.183351 | orchestrator | 2026-04-20 00:49:15.183358 | orchestrator | TASK [service-check-containers : rabbitmq | Check containers] ****************** 2026-04-20 00:49:15.183364 | orchestrator | Monday 20 April 2026 00:49:07 +0000 (0:00:00.587) 0:00:30.767 ********** 2026-04-20 00:49:15.183375 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': 
'/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-20 00:49:15.183385 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-20 00:49:15.183394 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-20 00:49:15.183409 | orchestrator | 2026-04-20 00:49:15.183416 | orchestrator | TASK [service-check-containers : rabbitmq | Notify handlers to restart containers] *** 2026-04-20 00:49:15.183423 | orchestrator | Monday 20 April 2026 00:49:09 +0000 (0:00:01.141) 0:00:31.909 ********** 2026-04-20 00:49:15.183430 | orchestrator | changed: [testbed-node-0] => { 2026-04-20 00:49:15.183436 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:49:15.183443 | orchestrator | } 2026-04-20 00:49:15.183450 | orchestrator | changed: [testbed-node-1] => { 2026-04-20 00:49:15.183457 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:49:15.183463 | orchestrator | } 2026-04-20 00:49:15.183470 | orchestrator | changed: [testbed-node-2] => { 2026-04-20 00:49:15.183477 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:49:15.183484 | orchestrator | } 2026-04-20 00:49:15.183490 | orchestrator | 2026-04-20 00:49:15.183497 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-20 00:49:15.183503 | 
orchestrator | Monday 20 April 2026 00:49:09 +0000 (0:00:00.278) 0:00:32.188 ********** 2026-04-20 00:49:15.183515 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-20 00:49:15.183525 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-20 00:49:15.183533 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:49:15.183540 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:49:15.183546 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-20 00:49:15.183568 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:49:15.183575 | orchestrator | 2026-04-20 00:49:15.183581 | orchestrator | TASK [rabbitmq : Creating rabbitmq volume] ************************************* 2026-04-20 00:49:15.183587 | orchestrator | Monday 20 April 2026 00:49:10 +0000 (0:00:00.803) 0:00:32.991 ********** 2026-04-20 00:49:15.183593 | orchestrator | changed: 
[testbed-node-0] 2026-04-20 00:49:15.183600 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:49:15.183606 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:49:15.183613 | orchestrator | 2026-04-20 00:49:15.183619 | orchestrator | TASK [rabbitmq : Running RabbitMQ bootstrap container] ************************* 2026-04-20 00:49:15.183625 | orchestrator | Monday 20 April 2026 00:49:11 +0000 (0:00:00.979) 0:00:33.970 ********** 2026-04-20 00:49:15.183642 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=4.1.8.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Frabbitmq\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_bwqjdvbt/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_bwqjdvbt/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_bwqjdvbt/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n 
self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=4.1.8.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Frabbitmq: Internal Server Error (\"unknown: repository kolla/release/2024.2/rabbitmq not found\")\\n'"} 2026-04-20 00:49:15.183651 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=4.1.8.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Frabbitmq\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_t_h8jlgb/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_t_h8jlgb/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File 
\"/tmp/ansible_kolla_container_payload_t_h8jlgb/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=4.1.8.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Frabbitmq: Internal Server Error (\"unknown: repository kolla/release/2024.2/rabbitmq not found\")\\n'"} 2026-04-20 00:49:15.183670 | orchestrator | fatal: [testbed-node-2]: FAILED! 
=> {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=4.1.8.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Frabbitmq\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_lk4w136z/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_lk4w136z/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_lk4w136z/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for 
http+docker://localhost/v1.47/images/create?tag=4.1.8.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Frabbitmq: Internal Server Error (\"unknown: repository kolla/release/2024.2/rabbitmq not found\")\\n'"} 2026-04-20 00:49:15.183682 | orchestrator | 2026-04-20 00:49:15.183689 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:49:15.183696 | orchestrator | localhost : ok=3  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=1  2026-04-20 00:49:15.183704 | orchestrator | testbed-node-0 : ok=19  changed=12  unreachable=0 failed=1  skipped=9  rescued=0 ignored=0 2026-04-20 00:49:15.183710 | orchestrator | testbed-node-1 : ok=17  changed=12  unreachable=0 failed=1  skipped=3  rescued=0 ignored=0 2026-04-20 00:49:15.183716 | orchestrator | testbed-node-2 : ok=17  changed=12  unreachable=0 failed=1  skipped=3  rescued=0 ignored=0 2026-04-20 00:49:15.183723 | orchestrator | 2026-04-20 00:49:15.183730 | orchestrator | 2026-04-20 00:49:15.183736 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 00:49:15.183742 | orchestrator | Monday 20 April 2026 00:49:12 +0000 (0:00:01.122) 0:00:35.093 ********** 2026-04-20 00:49:15.183749 | orchestrator | =============================================================================== 2026-04-20 00:49:15.183755 | orchestrator | Check RabbitMQ service -------------------------------------------------- 3.22s 2026-04-20 00:49:15.183761 | orchestrator | rabbitmq : include_tasks ------------------------------------------------ 2.47s 2026-04-20 00:49:15.183767 | orchestrator | rabbitmq : Copying over rabbitmq.conf ----------------------------------- 2.47s 2026-04-20 00:49:15.183773 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.92s 2026-04-20 00:49:15.183780 | orchestrator | rabbitmq : Get container facts ------------------------------------------ 
1.73s 2026-04-20 00:49:15.183786 | orchestrator | rabbitmq : Copying over enabled_plugins --------------------------------- 1.49s 2026-04-20 00:49:15.183792 | orchestrator | rabbitmq : Copying over advanced.config --------------------------------- 1.48s 2026-04-20 00:49:15.183799 | orchestrator | rabbitmq : Copying over rabbitmq-env.conf ------------------------------- 1.44s 2026-04-20 00:49:15.183805 | orchestrator | rabbitmq : Copying over definitions.json -------------------------------- 1.42s 2026-04-20 00:49:15.183811 | orchestrator | rabbitmq : Copying over config.json files for services ------------------ 1.37s 2026-04-20 00:49:15.183817 | orchestrator | rabbitmq : Check if running RabbitMQ is at most one version behind ------ 1.33s 2026-04-20 00:49:15.183827 | orchestrator | rabbitmq : Copying over erl_inetrc -------------------------------------- 1.28s 2026-04-20 00:49:15.183833 | orchestrator | service-cert-copy : rabbitmq | Copying over extra CA certificates ------- 1.17s 2026-04-20 00:49:15.183839 | orchestrator | service-check-containers : rabbitmq | Check containers ------------------ 1.14s 2026-04-20 00:49:15.183846 | orchestrator | rabbitmq : Running RabbitMQ bootstrap container ------------------------- 1.12s 2026-04-20 00:49:15.183853 | orchestrator | rabbitmq : Ensuring config directories exist ---------------------------- 1.12s 2026-04-20 00:49:15.183859 | orchestrator | rabbitmq : Creating rabbitmq volume ------------------------------------- 0.98s 2026-04-20 00:49:15.183865 | orchestrator | service-check-containers : Include tasks -------------------------------- 0.80s 2026-04-20 00:49:15.183872 | orchestrator | Set kolla_action_rabbitmq = kolla_action_ng ----------------------------- 0.77s 2026-04-20 00:49:15.183878 | orchestrator | rabbitmq : Get container facts ------------------------------------------ 0.73s 2026-04-20 00:49:15.183884 | orchestrator | 2026-04-20 00:49:15 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state 
STARTED 2026-04-20 00:49:15.183894 | orchestrator | 2026-04-20 00:49:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:50:49.918324 | orchestrator | 2026-04-20 00:50:49 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:50:49.919071 | orchestrator | 2026-04-20 00:50:49 | INFO  | Task
64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:50:49.919847 | orchestrator | 2026-04-20 00:50:49 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:50:49.920044 | orchestrator | 2026-04-20 00:50:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:50:52.971317 | orchestrator | 2026-04-20 00:50:52 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:50:52.974492 | orchestrator | 2026-04-20 00:50:52 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:50:52.976424 | orchestrator | 2026-04-20 00:50:52 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:50:52.976510 | orchestrator | 2026-04-20 00:50:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:50:56.011220 | orchestrator | 2026-04-20 00:50:56 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:50:56.012709 | orchestrator | 2026-04-20 00:50:56 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:50:56.014269 | orchestrator | 2026-04-20 00:50:56 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:50:56.014337 | orchestrator | 2026-04-20 00:50:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:50:59.056620 | orchestrator | 2026-04-20 00:50:59 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:50:59.057890 | orchestrator | 2026-04-20 00:50:59 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:50:59.059987 | orchestrator | 2026-04-20 00:50:59 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:50:59.060183 | orchestrator | 2026-04-20 00:50:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:51:02.134942 | orchestrator | 2026-04-20 00:51:02 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state 
STARTED 2026-04-20 00:51:02.135008 | orchestrator | 2026-04-20 00:51:02 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:02.135016 | orchestrator | 2026-04-20 00:51:02 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:02.135020 | orchestrator | 2026-04-20 00:51:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:51:05.146463 | orchestrator | 2026-04-20 00:51:05 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:51:05.146527 | orchestrator | 2026-04-20 00:51:05 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:05.146784 | orchestrator | 2026-04-20 00:51:05 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:05.146920 | orchestrator | 2026-04-20 00:51:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:51:08.190798 | orchestrator | 2026-04-20 00:51:08 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:51:08.193563 | orchestrator | 2026-04-20 00:51:08 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:08.195787 | orchestrator | 2026-04-20 00:51:08 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:08.196071 | orchestrator | 2026-04-20 00:51:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:51:11.235775 | orchestrator | 2026-04-20 00:51:11 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:51:11.237887 | orchestrator | 2026-04-20 00:51:11 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:11.239519 | orchestrator | 2026-04-20 00:51:11 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:11.239589 | orchestrator | 2026-04-20 00:51:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:51:14.291666 | orchestrator | 
2026-04-20 00:51:14 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:51:14.292883 | orchestrator | 2026-04-20 00:51:14 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:14.294163 | orchestrator | 2026-04-20 00:51:14 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:14.294220 | orchestrator | 2026-04-20 00:51:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:51:17.333733 | orchestrator | 2026-04-20 00:51:17 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:51:17.336472 | orchestrator | 2026-04-20 00:51:17 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:17.338782 | orchestrator | 2026-04-20 00:51:17 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:17.338914 | orchestrator | 2026-04-20 00:51:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:51:20.387062 | orchestrator | 2026-04-20 00:51:20 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:51:20.389081 | orchestrator | 2026-04-20 00:51:20 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:20.390899 | orchestrator | 2026-04-20 00:51:20 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:20.391011 | orchestrator | 2026-04-20 00:51:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:51:23.435180 | orchestrator | 2026-04-20 00:51:23 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:51:23.435299 | orchestrator | 2026-04-20 00:51:23 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:23.436180 | orchestrator | 2026-04-20 00:51:23 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:23.436240 | orchestrator | 2026-04-20 00:51:23 | INFO  | 
Wait 1 second(s) until the next check 2026-04-20 00:51:26.517439 | orchestrator | 2026-04-20 00:51:26 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:51:26.519240 | orchestrator | 2026-04-20 00:51:26 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:26.519869 | orchestrator | 2026-04-20 00:51:26 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:26.521745 | orchestrator | 2026-04-20 00:51:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:51:29.582176 | orchestrator | 2026-04-20 00:51:29 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:51:29.582495 | orchestrator | 2026-04-20 00:51:29 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:29.583371 | orchestrator | 2026-04-20 00:51:29 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:29.583412 | orchestrator | 2026-04-20 00:51:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:51:32.616778 | orchestrator | 2026-04-20 00:51:32 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state STARTED 2026-04-20 00:51:32.616853 | orchestrator | 2026-04-20 00:51:32 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:32.617069 | orchestrator | 2026-04-20 00:51:32 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:32.617080 | orchestrator | 2026-04-20 00:51:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:51:35.666002 | orchestrator | 2026-04-20 00:51:35.666196 | orchestrator | 2026-04-20 00:51:35.666204 | orchestrator | PLAY [Prepare all k3s nodes] *************************************************** 2026-04-20 00:51:35.666210 | orchestrator | 2026-04-20 00:51:35.666227 | orchestrator | TASK [k3s_prereq : Validating arguments against arg spec 'main' - Prerequisites] *** 2026-04-20 
00:51:35.666234 | orchestrator | Monday 20 April 2026 00:47:02 +0000 (0:00:00.273) 0:00:00.273 ********** 2026-04-20 00:51:35.666244 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:51:35.666251 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:51:35.666255 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:51:35.666260 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.666264 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.666269 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.666363 | orchestrator | 2026-04-20 00:51:35.666370 | orchestrator | TASK [k3s_prereq : Set same timezone on every Server] ************************** 2026-04-20 00:51:35.666374 | orchestrator | Monday 20 April 2026 00:47:02 +0000 (0:00:00.583) 0:00:00.856 ********** 2026-04-20 00:51:35.666379 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:51:35.666384 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:51:35.666388 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:51:35.666405 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.666416 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.666420 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.666425 | orchestrator | 2026-04-20 00:51:35.666429 | orchestrator | TASK [k3s_prereq : Set SELinux to disabled state] ****************************** 2026-04-20 00:51:35.666433 | orchestrator | Monday 20 April 2026 00:47:03 +0000 (0:00:00.648) 0:00:01.505 ********** 2026-04-20 00:51:35.666438 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:51:35.666442 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:51:35.666446 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:51:35.666450 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.666455 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.666459 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.666463 | orchestrator | 2026-04-20 00:51:35.666468 | 
orchestrator | TASK [k3s_prereq : Enable IPv4 forwarding] ************************************* 2026-04-20 00:51:35.666472 | orchestrator | Monday 20 April 2026 00:47:03 +0000 (0:00:00.462) 0:00:01.967 ********** 2026-04-20 00:51:35.666476 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:51:35.666480 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:51:35.666484 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:51:35.666489 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:51:35.666493 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:51:35.666497 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:51:35.666501 | orchestrator | 2026-04-20 00:51:35.666522 | orchestrator | TASK [k3s_prereq : Enable IPv6 forwarding] ************************************* 2026-04-20 00:51:35.666592 | orchestrator | Monday 20 April 2026 00:47:05 +0000 (0:00:01.798) 0:00:03.765 ********** 2026-04-20 00:51:35.666597 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:51:35.666601 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:51:35.666605 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:51:35.666610 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:51:35.666614 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:51:35.666618 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:51:35.666622 | orchestrator | 2026-04-20 00:51:35.666627 | orchestrator | TASK [k3s_prereq : Enable IPv6 router advertisements] ************************** 2026-04-20 00:51:35.666631 | orchestrator | Monday 20 April 2026 00:47:07 +0000 (0:00:01.400) 0:00:05.165 ********** 2026-04-20 00:51:35.666635 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:51:35.666640 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:51:35.666644 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:51:35.666648 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:51:35.666652 | orchestrator | changed: [testbed-node-3] 2026-04-20 
00:51:35.666657 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:51:35.666661 | orchestrator | 2026-04-20 00:51:35.666665 | orchestrator | TASK [k3s_prereq : Add br_netfilter to /etc/modules-load.d/] ******************* 2026-04-20 00:51:35.666670 | orchestrator | Monday 20 April 2026 00:47:08 +0000 (0:00:01.799) 0:00:06.965 ********** 2026-04-20 00:51:35.666674 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:51:35.666678 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:51:35.666682 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:51:35.666686 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.666691 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.666695 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.666699 | orchestrator | 2026-04-20 00:51:35.666703 | orchestrator | TASK [k3s_prereq : Load br_netfilter] ****************************************** 2026-04-20 00:51:35.666708 | orchestrator | Monday 20 April 2026 00:47:09 +0000 (0:00:00.889) 0:00:07.854 ********** 2026-04-20 00:51:35.666712 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:51:35.666716 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:51:35.666721 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:51:35.666725 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.666729 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.666733 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.666737 | orchestrator | 2026-04-20 00:51:35.666742 | orchestrator | TASK [k3s_prereq : Set bridge-nf-call-iptables (just to be sure)] ************** 2026-04-20 00:51:35.666746 | orchestrator | Monday 20 April 2026 00:47:10 +0000 (0:00:00.705) 0:00:08.559 ********** 2026-04-20 00:51:35.666750 | orchestrator | skipping: [testbed-node-3] => (item=net.bridge.bridge-nf-call-iptables)  2026-04-20 00:51:35.666754 | orchestrator | skipping: [testbed-node-3] => 
(item=net.bridge.bridge-nf-call-ip6tables)  2026-04-20 00:51:35.666759 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:51:35.666763 | orchestrator | skipping: [testbed-node-4] => (item=net.bridge.bridge-nf-call-iptables)  2026-04-20 00:51:35.666767 | orchestrator | skipping: [testbed-node-4] => (item=net.bridge.bridge-nf-call-ip6tables)  2026-04-20 00:51:35.666771 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:51:35.666775 | orchestrator | skipping: [testbed-node-5] => (item=net.bridge.bridge-nf-call-iptables)  2026-04-20 00:51:35.666780 | orchestrator | skipping: [testbed-node-5] => (item=net.bridge.bridge-nf-call-ip6tables)  2026-04-20 00:51:35.666784 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:51:35.666788 | orchestrator | skipping: [testbed-node-1] => (item=net.bridge.bridge-nf-call-iptables)  2026-04-20 00:51:35.666804 | orchestrator | skipping: [testbed-node-1] => (item=net.bridge.bridge-nf-call-ip6tables)  2026-04-20 00:51:35.666809 | orchestrator | skipping: [testbed-node-0] => (item=net.bridge.bridge-nf-call-iptables)  2026-04-20 00:51:35.666813 | orchestrator | skipping: [testbed-node-0] => (item=net.bridge.bridge-nf-call-ip6tables)  2026-04-20 00:51:35.666824 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.666829 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.666833 | orchestrator | skipping: [testbed-node-2] => (item=net.bridge.bridge-nf-call-iptables)  2026-04-20 00:51:35.666837 | orchestrator | skipping: [testbed-node-2] => (item=net.bridge.bridge-nf-call-ip6tables)  2026-04-20 00:51:35.666841 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.666846 | orchestrator | 2026-04-20 00:51:35.666850 | orchestrator | TASK [k3s_prereq : Add /usr/local/bin to sudo secure_path] ********************* 2026-04-20 00:51:35.666854 | orchestrator | Monday 20 April 2026 00:47:11 +0000 (0:00:00.979) 0:00:09.538 ********** 2026-04-20 00:51:35.666858 | orchestrator | skipping: 
[testbed-node-3] 2026-04-20 00:51:35.666863 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:51:35.666871 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:51:35.666875 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.666879 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.666884 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.666888 | orchestrator | 2026-04-20 00:51:35.666892 | orchestrator | TASK [k3s_download : Validating arguments against arg spec 'main' - Manage the downloading of K3S binaries] *** 2026-04-20 00:51:35.666897 | orchestrator | Monday 20 April 2026 00:47:12 +0000 (0:00:01.430) 0:00:10.969 ********** 2026-04-20 00:51:35.666901 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:51:35.666905 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:51:35.666910 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:51:35.666914 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.666918 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.666922 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.666927 | orchestrator | 2026-04-20 00:51:35.666931 | orchestrator | TASK [k3s_download : Download k3s binary x64] ********************************** 2026-04-20 00:51:35.666935 | orchestrator | Monday 20 April 2026 00:47:13 +0000 (0:00:00.726) 0:00:11.696 ********** 2026-04-20 00:51:35.666939 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:51:35.666944 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:51:35.666948 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:51:35.666952 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:51:35.666956 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:51:35.666961 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:51:35.666965 | orchestrator | 2026-04-20 00:51:35.666969 | orchestrator | TASK [k3s_download : Download k3s binary arm64] ******************************** 2026-04-20 00:51:35.666973 | orchestrator | 
Monday 20 April 2026 00:47:19 +0000 (0:00:05.852) 0:00:17.548 ********** 2026-04-20 00:51:35.666978 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:51:35.666982 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:51:35.666986 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:51:35.666990 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.666994 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.666999 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.667004 | orchestrator | 2026-04-20 00:51:35.667009 | orchestrator | TASK [k3s_download : Download k3s binary armhf] ******************************** 2026-04-20 00:51:35.667014 | orchestrator | Monday 20 April 2026 00:47:21 +0000 (0:00:01.618) 0:00:19.171 ********** 2026-04-20 00:51:35.667019 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:51:35.667077 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:51:35.667083 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:51:35.667088 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.667093 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.667098 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.667103 | orchestrator | 2026-04-20 00:51:35.667108 | orchestrator | TASK [k3s_custom_registries : Validating arguments against arg spec 'main' - Configure the use of a custom container registry] *** 2026-04-20 00:51:35.667115 | orchestrator | Monday 20 April 2026 00:47:24 +0000 (0:00:03.520) 0:00:22.691 ********** 2026-04-20 00:51:35.667120 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:51:35.667131 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:51:35.667136 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:51:35.667141 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.667146 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.667150 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.667154 
| orchestrator | 2026-04-20 00:51:35.667159 | orchestrator | TASK [k3s_custom_registries : Create directory /etc/rancher/k3s] *************** 2026-04-20 00:51:35.667163 | orchestrator | Monday 20 April 2026 00:47:27 +0000 (0:00:02.777) 0:00:25.469 ********** 2026-04-20 00:51:35.667168 | orchestrator | skipping: [testbed-node-3] => (item=rancher)  2026-04-20 00:51:35.667173 | orchestrator | skipping: [testbed-node-3] => (item=rancher/k3s)  2026-04-20 00:51:35.667177 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:51:35.667181 | orchestrator | skipping: [testbed-node-4] => (item=rancher)  2026-04-20 00:51:35.667186 | orchestrator | skipping: [testbed-node-4] => (item=rancher/k3s)  2026-04-20 00:51:35.667190 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:51:35.667194 | orchestrator | skipping: [testbed-node-5] => (item=rancher)  2026-04-20 00:51:35.667198 | orchestrator | skipping: [testbed-node-5] => (item=rancher/k3s)  2026-04-20 00:51:35.667203 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:51:35.667207 | orchestrator | skipping: [testbed-node-0] => (item=rancher)  2026-04-20 00:51:35.667211 | orchestrator | skipping: [testbed-node-0] => (item=rancher/k3s)  2026-04-20 00:51:35.667216 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.667220 | orchestrator | skipping: [testbed-node-1] => (item=rancher)  2026-04-20 00:51:35.667224 | orchestrator | skipping: [testbed-node-1] => (item=rancher/k3s)  2026-04-20 00:51:35.667228 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.667233 | orchestrator | skipping: [testbed-node-2] => (item=rancher)  2026-04-20 00:51:35.667237 | orchestrator | skipping: [testbed-node-2] => (item=rancher/k3s)  2026-04-20 00:51:35.667241 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.667246 | orchestrator | 2026-04-20 00:51:35.667250 | orchestrator | TASK [k3s_custom_registries : Insert registries into /etc/rancher/k3s/registries.yaml] *** 2026-04-20 00:51:35.667260 | 
orchestrator | Monday 20 April 2026 00:47:28 +0000 (0:00:01.451) 0:00:26.920 ********** 2026-04-20 00:51:35.667265 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:51:35.667269 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:51:35.667273 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:51:35.667278 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.667282 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.667286 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.667291 | orchestrator | 2026-04-20 00:51:35.667295 | orchestrator | TASK [k3s_custom_registries : Remove /etc/rancher/k3s/registries.yaml when no registries configured] *** 2026-04-20 00:51:35.667299 | orchestrator | Monday 20 April 2026 00:47:30 +0000 (0:00:01.198) 0:00:28.119 ********** 2026-04-20 00:51:35.667304 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:51:35.667308 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:51:35.667312 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:51:35.667316 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.667321 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.667325 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.667329 | orchestrator | 2026-04-20 00:51:35.667337 | orchestrator | PLAY [Deploy k3s master nodes] ************************************************* 2026-04-20 00:51:35.667342 | orchestrator | 2026-04-20 00:51:35.667346 | orchestrator | TASK [k3s_server : Validating arguments against arg spec 'main' - Setup k3s servers] *** 2026-04-20 00:51:35.667351 | orchestrator | Monday 20 April 2026 00:47:31 +0000 (0:00:01.488) 0:00:29.608 ********** 2026-04-20 00:51:35.667355 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.667359 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.667364 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.667368 | orchestrator | 2026-04-20 00:51:35.667372 | orchestrator | 
TASK [k3s_server : Stop k3s-init] ********************************************** 2026-04-20 00:51:35.667380 | orchestrator | Monday 20 April 2026 00:47:33 +0000 (0:00:02.010) 0:00:31.618 ********** 2026-04-20 00:51:35.667385 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.667389 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.667393 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.667398 | orchestrator | 2026-04-20 00:51:35.667402 | orchestrator | TASK [k3s_server : Stop k3s] *************************************************** 2026-04-20 00:51:35.667406 | orchestrator | Monday 20 April 2026 00:47:34 +0000 (0:00:01.293) 0:00:32.912 ********** 2026-04-20 00:51:35.667411 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.667415 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.667419 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.667424 | orchestrator | 2026-04-20 00:51:35.667431 | orchestrator | TASK [k3s_server : Clean previous runs of k3s-init] **************************** 2026-04-20 00:51:35.667438 | orchestrator | Monday 20 April 2026 00:47:35 +0000 (0:00:00.961) 0:00:33.874 ********** 2026-04-20 00:51:35.667444 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.667451 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.667457 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.667464 | orchestrator | 2026-04-20 00:51:35.667471 | orchestrator | TASK [k3s_server : Deploy K3s http_proxy conf] ********************************* 2026-04-20 00:51:35.667477 | orchestrator | Monday 20 April 2026 00:47:36 +0000 (0:00:01.081) 0:00:34.956 ********** 2026-04-20 00:51:35.667484 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.667491 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.667499 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.667505 | orchestrator | 2026-04-20 00:51:35.667513 | orchestrator | TASK [k3s_server : Create /etc/rancher/k3s directory] 
************************** 2026-04-20 00:51:35.667518 | orchestrator | Monday 20 April 2026 00:47:37 +0000 (0:00:00.685) 0:00:35.642 ********** 2026-04-20 00:51:35.667522 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:51:35.667543 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:51:35.667550 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:51:35.667557 | orchestrator | 2026-04-20 00:51:35.667564 | orchestrator | TASK [k3s_server : Create custom resolv.conf for k3s] ************************** 2026-04-20 00:51:35.667571 | orchestrator | Monday 20 April 2026 00:47:38 +0000 (0:00:01.056) 0:00:36.698 ********** 2026-04-20 00:51:35.667578 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:51:35.667585 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:51:35.667593 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:51:35.667598 | orchestrator | 2026-04-20 00:51:35.667602 | orchestrator | TASK [k3s_server : Deploy vip manifest] **************************************** 2026-04-20 00:51:35.667606 | orchestrator | Monday 20 April 2026 00:47:39 +0000 (0:00:01.251) 0:00:37.950 ********** 2026-04-20 00:51:35.667611 | orchestrator | included: /ansible/roles/k3s_server/tasks/vip.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:51:35.667615 | orchestrator | 2026-04-20 00:51:35.667619 | orchestrator | TASK [k3s_server : Set _kube_vip_bgp_peers fact] ******************************* 2026-04-20 00:51:35.667624 | orchestrator | Monday 20 April 2026 00:47:40 +0000 (0:00:00.674) 0:00:38.624 ********** 2026-04-20 00:51:35.667628 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.667632 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.667636 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.667655 | orchestrator | 2026-04-20 00:51:35.667660 | orchestrator | TASK [k3s_server : Create manifests directory on first master] ***************** 2026-04-20 00:51:35.667665 | orchestrator | Monday 20 April 2026 
00:47:44 +0000 (0:00:03.653) 0:00:42.277 ********** 2026-04-20 00:51:35.667669 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.667673 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.667678 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:51:35.667682 | orchestrator | 2026-04-20 00:51:35.667686 | orchestrator | TASK [k3s_server : Download vip rbac manifest to first master] ***************** 2026-04-20 00:51:35.667691 | orchestrator | Monday 20 April 2026 00:47:45 +0000 (0:00:00.844) 0:00:43.122 ********** 2026-04-20 00:51:35.667695 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.667703 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.667708 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:51:35.667712 | orchestrator | 2026-04-20 00:51:35.667716 | orchestrator | TASK [k3s_server : Copy vip manifest to first master] ************************** 2026-04-20 00:51:35.667721 | orchestrator | Monday 20 April 2026 00:47:46 +0000 (0:00:01.777) 0:00:44.900 ********** 2026-04-20 00:51:35.667725 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.667729 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.667733 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:51:35.667738 | orchestrator | 2026-04-20 00:51:35.667742 | orchestrator | TASK [k3s_server : Deploy metallb manifest] ************************************ 2026-04-20 00:51:35.667751 | orchestrator | Monday 20 April 2026 00:47:48 +0000 (0:00:01.747) 0:00:46.647 ********** 2026-04-20 00:51:35.667755 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.667760 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.667764 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.667768 | orchestrator | 2026-04-20 00:51:35.667772 | orchestrator | TASK [k3s_server : Deploy kube-vip manifest] *********************************** 2026-04-20 00:51:35.667777 | orchestrator | Monday 20 April 2026 
00:47:49 +0000 (0:00:00.502) 0:00:47.150 ********** 2026-04-20 00:51:35.667781 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.667785 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.667789 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.667794 | orchestrator | 2026-04-20 00:51:35.667798 | orchestrator | TASK [k3s_server : Init cluster inside the transient k3s-init service] ********* 2026-04-20 00:51:35.667802 | orchestrator | Monday 20 April 2026 00:47:49 +0000 (0:00:00.657) 0:00:47.807 ********** 2026-04-20 00:51:35.667806 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:51:35.667811 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:51:35.667815 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:51:35.667819 | orchestrator | 2026-04-20 00:51:35.667827 | orchestrator | TASK [k3s_server : Detect Kubernetes version for label compatibility] ********** 2026-04-20 00:51:35.667832 | orchestrator | Monday 20 April 2026 00:47:52 +0000 (0:00:03.161) 0:00:50.969 ********** 2026-04-20 00:51:35.667836 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.667840 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.667845 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.667849 | orchestrator | 2026-04-20 00:51:35.667853 | orchestrator | TASK [k3s_server : Set node role label selector based on Kubernetes version] *** 2026-04-20 00:51:35.667858 | orchestrator | Monday 20 April 2026 00:47:55 +0000 (0:00:02.082) 0:00:53.051 ********** 2026-04-20 00:51:35.667862 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.667866 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.667870 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.667875 | orchestrator | 2026-04-20 00:51:35.667879 | orchestrator | TASK [k3s_server : Verify that all nodes actually joined (check k3s-init.service if this fails)] *** 2026-04-20 00:51:35.667883 | orchestrator | Monday 20 April 2026 00:47:55 +0000 
(0:00:00.728) 0:00:53.780 ********** 2026-04-20 00:51:35.667888 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (20 retries left). 2026-04-20 00:51:35.667893 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (20 retries left). 2026-04-20 00:51:35.667897 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (20 retries left). 2026-04-20 00:51:35.667901 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (19 retries left). 2026-04-20 00:51:35.667906 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (19 retries left). 2026-04-20 00:51:35.667910 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (19 retries left). 2026-04-20 00:51:35.667918 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (18 retries left). 2026-04-20 00:51:35.667922 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (18 retries left). 2026-04-20 00:51:35.667926 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (18 retries left). 2026-04-20 00:51:35.667931 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (17 retries left). 2026-04-20 00:51:35.667935 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (17 retries left). 
2026-04-20 00:51:35.667939 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (17 retries left). 2026-04-20 00:51:35.667943 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (16 retries left). 2026-04-20 00:51:35.667948 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (16 retries left). 2026-04-20 00:51:35.667952 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (16 retries left). 2026-04-20 00:51:35.667956 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.667961 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.667965 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.667969 | orchestrator | 2026-04-20 00:51:35.667973 | orchestrator | TASK [k3s_server : Save logs of k3s-init.service] ****************************** 2026-04-20 00:51:35.667978 | orchestrator | Monday 20 April 2026 00:48:49 +0000 (0:00:53.990) 0:01:47.770 ********** 2026-04-20 00:51:35.667982 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.667986 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.667990 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.667995 | orchestrator | 2026-04-20 00:51:35.667999 | orchestrator | TASK [k3s_server : Kill the temporary service used for initialization] ********* 2026-04-20 00:51:35.668006 | orchestrator | Monday 20 April 2026 00:48:50 +0000 (0:00:00.471) 0:01:48.242 ********** 2026-04-20 00:51:35.668011 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:51:35.668015 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:51:35.668019 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:51:35.668023 | orchestrator | 2026-04-20 00:51:35.668028 | orchestrator | TASK 
[k3s_server : Copy K3s service file] ************************************** 2026-04-20 00:51:35.668032 | orchestrator | Monday 20 April 2026 00:48:51 +0000 (0:00:01.578) 0:01:49.820 ********** 2026-04-20 00:51:35.668036 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:51:35.668040 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:51:35.668044 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:51:35.668049 | orchestrator | 2026-04-20 00:51:35.668053 | orchestrator | TASK [k3s_server : Enable and check K3s service] ******************************* 2026-04-20 00:51:35.668057 | orchestrator | Monday 20 April 2026 00:48:53 +0000 (0:00:01.206) 0:01:51.027 ********** 2026-04-20 00:51:35.668061 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:51:35.668066 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:51:35.668070 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:51:35.668074 | orchestrator | 2026-04-20 00:51:35.668081 | orchestrator | TASK [k3s_server : Wait for node-token] **************************************** 2026-04-20 00:51:35.668085 | orchestrator | Monday 20 April 2026 00:49:17 +0000 (0:00:24.818) 0:02:15.845 ********** 2026-04-20 00:51:35.668090 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.668094 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.668098 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.668102 | orchestrator | 2026-04-20 00:51:35.668107 | orchestrator | TASK [k3s_server : Register node-token file access mode] *********************** 2026-04-20 00:51:35.668115 | orchestrator | Monday 20 April 2026 00:49:18 +0000 (0:00:00.668) 0:02:16.514 ********** 2026-04-20 00:51:35.668119 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.668123 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.668128 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.668132 | orchestrator | 2026-04-20 00:51:35.668136 | orchestrator | TASK [k3s_server : Change file access node-token] 
****************************** 2026-04-20 00:51:35.668140 | orchestrator | Monday 20 April 2026 00:49:19 +0000 (0:00:00.810) 0:02:17.325 ********** 2026-04-20 00:51:35.668145 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:51:35.668149 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:51:35.668153 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:51:35.668157 | orchestrator | 2026-04-20 00:51:35.668161 | orchestrator | TASK [k3s_server : Read node-token from master] ******************************** 2026-04-20 00:51:35.668166 | orchestrator | Monday 20 April 2026 00:49:19 +0000 (0:00:00.645) 0:02:17.970 ********** 2026-04-20 00:51:35.668170 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.668174 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.668179 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.668183 | orchestrator | 2026-04-20 00:51:35.668187 | orchestrator | TASK [k3s_server : Store Master node-token] ************************************ 2026-04-20 00:51:35.668191 | orchestrator | Monday 20 April 2026 00:49:20 +0000 (0:00:00.706) 0:02:18.677 ********** 2026-04-20 00:51:35.668196 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.668200 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.668204 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.668208 | orchestrator | 2026-04-20 00:51:35.668213 | orchestrator | TASK [k3s_server : Restore node-token file access] ***************************** 2026-04-20 00:51:35.668217 | orchestrator | Monday 20 April 2026 00:49:20 +0000 (0:00:00.300) 0:02:18.978 ********** 2026-04-20 00:51:35.668221 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:51:35.668225 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:51:35.668230 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:51:35.668234 | orchestrator | 2026-04-20 00:51:35.668238 | orchestrator | TASK [k3s_server : Create directory .kube] ************************************* 2026-04-20 
00:51:35.668242 | orchestrator | Monday 20 April 2026 00:49:21 +0000 (0:00:00.925) 0:02:19.904 ********** 2026-04-20 00:51:35.668247 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:51:35.668251 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:51:35.668255 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:51:35.668259 | orchestrator | 2026-04-20 00:51:35.668263 | orchestrator | TASK [k3s_server : Copy config file to user home directory] ******************** 2026-04-20 00:51:35.668268 | orchestrator | Monday 20 April 2026 00:49:22 +0000 (0:00:00.762) 0:02:20.667 ********** 2026-04-20 00:51:35.668272 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:51:35.668276 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:51:35.668281 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:51:35.668285 | orchestrator | 2026-04-20 00:51:35.668289 | orchestrator | TASK [k3s_server : Configure kubectl cluster to https://192.168.16.8:6443] ***** 2026-04-20 00:51:35.668293 | orchestrator | Monday 20 April 2026 00:49:23 +0000 (0:00:00.929) 0:02:21.596 ********** 2026-04-20 00:51:35.668298 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:51:35.668302 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:51:35.668306 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:51:35.668310 | orchestrator | 2026-04-20 00:51:35.668315 | orchestrator | TASK [k3s_server : Create kubectl symlink] ************************************* 2026-04-20 00:51:35.668319 | orchestrator | Monday 20 April 2026 00:49:24 +0000 (0:00:00.980) 0:02:22.576 ********** 2026-04-20 00:51:35.668323 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.668327 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.668332 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.668336 | orchestrator | 2026-04-20 00:51:35.668340 | orchestrator | TASK [k3s_server : Create crictl symlink] ************************************** 2026-04-20 
00:51:35.668344 | orchestrator | Monday 20 April 2026 00:49:24 +0000 (0:00:00.278) 0:02:22.855 ********** 2026-04-20 00:51:35.668352 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.668356 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.668363 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.668370 | orchestrator | 2026-04-20 00:51:35.668377 | orchestrator | TASK [k3s_server : Get contents of manifests folder] *************************** 2026-04-20 00:51:35.668384 | orchestrator | Monday 20 April 2026 00:49:25 +0000 (0:00:00.516) 0:02:23.371 ********** 2026-04-20 00:51:35.668392 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.668399 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.668405 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.668413 | orchestrator | 2026-04-20 00:51:35.668419 | orchestrator | TASK [k3s_server : Get sub dirs of manifests folder] *************************** 2026-04-20 00:51:35.668426 | orchestrator | Monday 20 April 2026 00:49:26 +0000 (0:00:00.689) 0:02:24.061 ********** 2026-04-20 00:51:35.668433 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.668445 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.668451 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.668459 | orchestrator | 2026-04-20 00:51:35.668465 | orchestrator | TASK [k3s_server : Remove manifests and folders that are only needed for bootstrapping cluster so k3s doesn't auto apply on start] *** 2026-04-20 00:51:35.668472 | orchestrator | Monday 20 April 2026 00:49:26 +0000 (0:00:00.737) 0:02:24.798 ********** 2026-04-20 00:51:35.668479 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/rolebindings.yaml) 2026-04-20 00:51:35.668486 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/rolebindings.yaml) 2026-04-20 00:51:35.668494 | orchestrator | changed: [testbed-node-2] => 
(item=/var/lib/rancher/k3s/server/manifests/rolebindings.yaml) 2026-04-20 00:51:35.668501 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/local-storage.yaml) 2026-04-20 00:51:35.668511 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/local-storage.yaml) 2026-04-20 00:51:35.668518 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/local-storage.yaml) 2026-04-20 00:51:35.668558 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/coredns.yaml) 2026-04-20 00:51:35.668568 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/coredns.yaml) 2026-04-20 00:51:35.668575 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/coredns.yaml) 2026-04-20 00:51:35.668581 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/runtimes.yaml) 2026-04-20 00:51:35.668588 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/runtimes.yaml) 2026-04-20 00:51:35.668596 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/vip.yaml) 2026-04-20 00:51:35.668603 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/ccm.yaml) 2026-04-20 00:51:35.668610 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/ccm.yaml) 2026-04-20 00:51:35.668618 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/vip-rbac.yaml) 2026-04-20 00:51:35.668624 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/metrics-server) 2026-04-20 00:51:35.668629 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/metrics-server) 2026-04-20 00:51:35.668633 | orchestrator | changed: [testbed-node-0] => 
(item=/var/lib/rancher/k3s/server/manifests/runtimes.yaml) 2026-04-20 00:51:35.668638 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/ccm.yaml) 2026-04-20 00:51:35.668642 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/metrics-server) 2026-04-20 00:51:35.668647 | orchestrator | 2026-04-20 00:51:35.668651 | orchestrator | PLAY [Deploy k3s worker nodes] ************************************************* 2026-04-20 00:51:35.668665 | orchestrator | 2026-04-20 00:51:35.668670 | orchestrator | TASK [k3s_agent : Validating arguments against arg spec 'main' - Setup k3s agents] *** 2026-04-20 00:51:35.668674 | orchestrator | Monday 20 April 2026 00:49:30 +0000 (0:00:03.880) 0:02:28.679 ********** 2026-04-20 00:51:35.668679 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:51:35.668683 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:51:35.668687 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:51:35.668691 | orchestrator | 2026-04-20 00:51:35.668696 | orchestrator | TASK [k3s_agent : Check if system is PXE-booted] ******************************* 2026-04-20 00:51:35.668700 | orchestrator | Monday 20 April 2026 00:49:31 +0000 (0:00:00.355) 0:02:29.035 ********** 2026-04-20 00:51:35.668704 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:51:35.668708 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:51:35.668713 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:51:35.668717 | orchestrator | 2026-04-20 00:51:35.668721 | orchestrator | TASK [k3s_agent : Set fact for PXE-booted system] ****************************** 2026-04-20 00:51:35.668725 | orchestrator | Monday 20 April 2026 00:49:31 +0000 (0:00:00.675) 0:02:29.710 ********** 2026-04-20 00:51:35.668730 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:51:35.668734 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:51:35.668738 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:51:35.668742 | orchestrator | 2026-04-20 
00:51:35.668747 | orchestrator | TASK [k3s_agent : Include http_proxy configuration tasks] ********************** 2026-04-20 00:51:35.668751 | orchestrator | Monday 20 April 2026 00:49:32 +0000 (0:00:00.311) 0:02:30.021 ********** 2026-04-20 00:51:35.668755 | orchestrator | included: /ansible/roles/k3s_agent/tasks/http_proxy.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:51:35.668760 | orchestrator | 2026-04-20 00:51:35.668764 | orchestrator | TASK [k3s_agent : Create k3s-node.service.d directory] ************************* 2026-04-20 00:51:35.668769 | orchestrator | Monday 20 April 2026 00:49:32 +0000 (0:00:00.681) 0:02:30.703 ********** 2026-04-20 00:51:35.668773 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:51:35.668777 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:51:35.668781 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:51:35.668786 | orchestrator | 2026-04-20 00:51:35.668790 | orchestrator | TASK [k3s_agent : Copy K3s http_proxy conf file] ******************************* 2026-04-20 00:51:35.668794 | orchestrator | Monday 20 April 2026 00:49:33 +0000 (0:00:00.296) 0:02:30.999 ********** 2026-04-20 00:51:35.668798 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:51:35.668803 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:51:35.668807 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:51:35.668811 | orchestrator | 2026-04-20 00:51:35.668815 | orchestrator | TASK [k3s_agent : Deploy K3s http_proxy conf] ********************************** 2026-04-20 00:51:35.668824 | orchestrator | Monday 20 April 2026 00:49:33 +0000 (0:00:00.304) 0:02:31.303 ********** 2026-04-20 00:51:35.668828 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:51:35.668833 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:51:35.668837 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:51:35.668841 | orchestrator | 2026-04-20 00:51:35.668846 | orchestrator | TASK [k3s_agent : Create 
/etc/rancher/k3s directory] *************************** 2026-04-20 00:51:35.668850 | orchestrator | Monday 20 April 2026 00:49:33 +0000 (0:00:00.567) 0:02:31.871 ********** 2026-04-20 00:51:35.668854 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:51:35.668858 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:51:35.668863 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:51:35.668867 | orchestrator | 2026-04-20 00:51:35.668871 | orchestrator | TASK [k3s_agent : Create custom resolv.conf for k3s] *************************** 2026-04-20 00:51:35.668875 | orchestrator | Monday 20 April 2026 00:49:34 +0000 (0:00:00.813) 0:02:32.685 ********** 2026-04-20 00:51:35.668880 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:51:35.668884 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:51:35.668888 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:51:35.668892 | orchestrator | 2026-04-20 00:51:35.668900 | orchestrator | TASK [k3s_agent : Configure the k3s service] *********************************** 2026-04-20 00:51:35.668908 | orchestrator | Monday 20 April 2026 00:49:36 +0000 (0:00:01.643) 0:02:34.329 ********** 2026-04-20 00:51:35.668913 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:51:35.668917 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:51:35.668921 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:51:35.668925 | orchestrator | 2026-04-20 00:51:35.668930 | orchestrator | TASK [k3s_agent : Manage k3s service] ****************************************** 2026-04-20 00:51:35.668934 | orchestrator | Monday 20 April 2026 00:49:37 +0000 (0:00:01.655) 0:02:35.984 ********** 2026-04-20 00:51:35.668938 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:51:35.668942 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:51:35.668947 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:51:35.668951 | orchestrator | 2026-04-20 00:51:35.668955 | orchestrator | PLAY [Prepare kubeconfig file] 
************************************************* 2026-04-20 00:51:35.668959 | orchestrator | 2026-04-20 00:51:35.668964 | orchestrator | TASK [Get home directory of operator user] ************************************* 2026-04-20 00:51:35.668968 | orchestrator | Monday 20 April 2026 00:49:47 +0000 (0:00:10.001) 0:02:45.985 ********** 2026-04-20 00:51:35.668972 | orchestrator | ok: [testbed-manager] 2026-04-20 00:51:35.668977 | orchestrator | 2026-04-20 00:51:35.668981 | orchestrator | TASK [Create .kube directory] ************************************************** 2026-04-20 00:51:35.668985 | orchestrator | Monday 20 April 2026 00:49:48 +0000 (0:00:00.930) 0:02:46.916 ********** 2026-04-20 00:51:35.668989 | orchestrator | changed: [testbed-manager] 2026-04-20 00:51:35.668994 | orchestrator | 2026-04-20 00:51:35.668998 | orchestrator | TASK [Get kubeconfig file] ***************************************************** 2026-04-20 00:51:35.669002 | orchestrator | Monday 20 April 2026 00:49:49 +0000 (0:00:00.485) 0:02:47.402 ********** 2026-04-20 00:51:35.669006 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] 2026-04-20 00:51:35.669011 | orchestrator | 2026-04-20 00:51:35.669015 | orchestrator | TASK [Write kubeconfig file] *************************************************** 2026-04-20 00:51:35.669019 | orchestrator | Monday 20 April 2026 00:49:49 +0000 (0:00:00.571) 0:02:47.974 ********** 2026-04-20 00:51:35.669023 | orchestrator | changed: [testbed-manager] 2026-04-20 00:51:35.669028 | orchestrator | 2026-04-20 00:51:35.669032 | orchestrator | TASK [Change server address in the kubeconfig] ********************************* 2026-04-20 00:51:35.669036 | orchestrator | Monday 20 April 2026 00:49:50 +0000 (0:00:00.850) 0:02:48.824 ********** 2026-04-20 00:51:35.669040 | orchestrator | changed: [testbed-manager] 2026-04-20 00:51:35.669045 | orchestrator | 2026-04-20 00:51:35.669049 | orchestrator | TASK [Make kubeconfig available for use inside the 
manager service] ************ 2026-04-20 00:51:35.669053 | orchestrator | Monday 20 April 2026 00:49:51 +0000 (0:00:00.599) 0:02:49.424 ********** 2026-04-20 00:51:35.669058 | orchestrator | changed: [testbed-manager -> localhost] 2026-04-20 00:51:35.669062 | orchestrator | 2026-04-20 00:51:35.669066 | orchestrator | TASK [Change server address in the kubeconfig inside the manager service] ****** 2026-04-20 00:51:35.669070 | orchestrator | Monday 20 April 2026 00:49:53 +0000 (0:00:01.969) 0:02:51.394 ********** 2026-04-20 00:51:35.669075 | orchestrator | changed: [testbed-manager -> localhost] 2026-04-20 00:51:35.669079 | orchestrator | 2026-04-20 00:51:35.669083 | orchestrator | TASK [Set KUBECONFIG environment variable] ************************************* 2026-04-20 00:51:35.669087 | orchestrator | Monday 20 April 2026 00:49:54 +0000 (0:00:00.952) 0:02:52.347 ********** 2026-04-20 00:51:35.669092 | orchestrator | changed: [testbed-manager] 2026-04-20 00:51:35.669096 | orchestrator | 2026-04-20 00:51:35.669100 | orchestrator | TASK [Enable kubectl command line completion] ********************************** 2026-04-20 00:51:35.669104 | orchestrator | Monday 20 April 2026 00:49:54 +0000 (0:00:00.462) 0:02:52.809 ********** 2026-04-20 00:51:35.669109 | orchestrator | changed: [testbed-manager] 2026-04-20 00:51:35.669113 | orchestrator | 2026-04-20 00:51:35.669117 | orchestrator | PLAY [Apply role kubectl] ****************************************************** 2026-04-20 00:51:35.669121 | orchestrator | 2026-04-20 00:51:35.669126 | orchestrator | TASK [kubectl : Gather variables for each operating system] ******************** 2026-04-20 00:51:35.669135 | orchestrator | Monday 20 April 2026 00:49:55 +0000 (0:00:00.418) 0:02:53.228 ********** 2026-04-20 00:51:35.669139 | orchestrator | ok: [testbed-manager] 2026-04-20 00:51:35.669143 | orchestrator | 2026-04-20 00:51:35.669148 | orchestrator | TASK [kubectl : Include distribution specific install tasks] 
******************* 2026-04-20 00:51:35.669152 | orchestrator | Monday 20 April 2026 00:49:55 +0000 (0:00:00.137) 0:02:53.365 ********** 2026-04-20 00:51:35.669156 | orchestrator | included: /ansible/roles/kubectl/tasks/install-Debian-family.yml for testbed-manager 2026-04-20 00:51:35.669160 | orchestrator | 2026-04-20 00:51:35.669165 | orchestrator | TASK [kubectl : Remove old architecture-dependent repository] ****************** 2026-04-20 00:51:35.669169 | orchestrator | Monday 20 April 2026 00:49:55 +0000 (0:00:00.226) 0:02:53.592 ********** 2026-04-20 00:51:35.669173 | orchestrator | ok: [testbed-manager] 2026-04-20 00:51:35.669178 | orchestrator | 2026-04-20 00:51:35.669182 | orchestrator | TASK [kubectl : Install apt-transport-https package] *************************** 2026-04-20 00:51:35.669186 | orchestrator | Monday 20 April 2026 00:49:56 +0000 (0:00:00.765) 0:02:54.358 ********** 2026-04-20 00:51:35.669193 | orchestrator | ok: [testbed-manager] 2026-04-20 00:51:35.669198 | orchestrator | 2026-04-20 00:51:35.669202 | orchestrator | TASK [kubectl : Add repository gpg key] **************************************** 2026-04-20 00:51:35.669206 | orchestrator | Monday 20 April 2026 00:49:57 +0000 (0:00:01.295) 0:02:55.654 ********** 2026-04-20 00:51:35.669211 | orchestrator | changed: [testbed-manager] 2026-04-20 00:51:35.669215 | orchestrator | 2026-04-20 00:51:35.669219 | orchestrator | TASK [kubectl : Set permissions of gpg key] ************************************ 2026-04-20 00:51:35.669224 | orchestrator | Monday 20 April 2026 00:49:58 +0000 (0:00:00.781) 0:02:56.435 ********** 2026-04-20 00:51:35.669228 | orchestrator | ok: [testbed-manager] 2026-04-20 00:51:35.669232 | orchestrator | 2026-04-20 00:51:35.669236 | orchestrator | TASK [kubectl : Add repository Debian] ***************************************** 2026-04-20 00:51:35.669241 | orchestrator | Monday 20 April 2026 00:49:58 +0000 (0:00:00.429) 0:02:56.865 ********** 2026-04-20 00:51:35.669245 | 
orchestrator | changed: [testbed-manager] 2026-04-20 00:51:35.669249 | orchestrator | 2026-04-20 00:51:35.669253 | orchestrator | TASK [kubectl : Install required packages] ************************************* 2026-04-20 00:51:35.669261 | orchestrator | Monday 20 April 2026 00:50:06 +0000 (0:00:07.732) 0:03:04.598 ********** 2026-04-20 00:51:35.669265 | orchestrator | changed: [testbed-manager] 2026-04-20 00:51:35.669269 | orchestrator | 2026-04-20 00:51:35.669273 | orchestrator | TASK [kubectl : Remove kubectl symlink] **************************************** 2026-04-20 00:51:35.669278 | orchestrator | Monday 20 April 2026 00:50:18 +0000 (0:00:11.906) 0:03:16.505 ********** 2026-04-20 00:51:35.669282 | orchestrator | ok: [testbed-manager] 2026-04-20 00:51:35.669286 | orchestrator | 2026-04-20 00:51:35.669290 | orchestrator | PLAY [Run post actions on master nodes] **************************************** 2026-04-20 00:51:35.669295 | orchestrator | 2026-04-20 00:51:35.669299 | orchestrator | TASK [k3s_server_post : Validating arguments against arg spec 'main' - Configure k3s cluster] *** 2026-04-20 00:51:35.669303 | orchestrator | Monday 20 April 2026 00:50:18 +0000 (0:00:00.474) 0:03:16.980 ********** 2026-04-20 00:51:35.669307 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.669312 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.669316 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.669320 | orchestrator | 2026-04-20 00:51:35.669324 | orchestrator | TASK [k3s_server_post : Deploy calico] ***************************************** 2026-04-20 00:51:35.669329 | orchestrator | Monday 20 April 2026 00:50:19 +0000 (0:00:00.244) 0:03:17.224 ********** 2026-04-20 00:51:35.669333 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.669337 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.669341 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.669346 | orchestrator | 2026-04-20 00:51:35.669350 | 
orchestrator | TASK [k3s_server_post : Deploy cilium] ***************************************** 2026-04-20 00:51:35.669354 | orchestrator | Monday 20 April 2026 00:50:19 +0000 (0:00:00.392) 0:03:17.617 ********** 2026-04-20 00:51:35.669358 | orchestrator | included: /ansible/roles/k3s_server_post/tasks/cilium.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:51:35.669367 | orchestrator | 2026-04-20 00:51:35.669371 | orchestrator | TASK [k3s_server_post : Create tmp directory on first master] ****************** 2026-04-20 00:51:35.669375 | orchestrator | Monday 20 April 2026 00:50:20 +0000 (0:00:00.483) 0:03:18.101 ********** 2026-04-20 00:51:35.669380 | orchestrator | changed: [testbed-node-0 -> localhost] 2026-04-20 00:51:35.669384 | orchestrator | 2026-04-20 00:51:35.669388 | orchestrator | TASK [k3s_server_post : Wait for connectivity to kube VIP] ********************* 2026-04-20 00:51:35.669392 | orchestrator | Monday 20 April 2026 00:50:21 +0000 (0:00:00.912) 0:03:19.013 ********** 2026-04-20 00:51:35.669397 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-04-20 00:51:35.669401 | orchestrator | 2026-04-20 00:51:35.669405 | orchestrator | TASK [k3s_server_post : Fail if kube VIP not reachable] ************************ 2026-04-20 00:51:35.669409 | orchestrator | Monday 20 April 2026 00:50:21 +0000 (0:00:00.868) 0:03:19.881 ********** 2026-04-20 00:51:35.669414 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.669418 | orchestrator | 2026-04-20 00:51:35.669422 | orchestrator | TASK [k3s_server_post : Test for existing Cilium install] ********************** 2026-04-20 00:51:35.669426 | orchestrator | Monday 20 April 2026 00:50:22 +0000 (0:00:00.120) 0:03:20.001 ********** 2026-04-20 00:51:35.669431 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-04-20 00:51:35.669435 | orchestrator | 2026-04-20 00:51:35.669439 | orchestrator | TASK [k3s_server_post : Check Cilium version] ********************************** 2026-04-20 
00:51:35.669443 | orchestrator | Monday 20 April 2026 00:50:22 +0000 (0:00:00.923) 0:03:20.925 ********** 2026-04-20 00:51:35.669448 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.669452 | orchestrator | 2026-04-20 00:51:35.669456 | orchestrator | TASK [k3s_server_post : Parse installed Cilium version] ************************ 2026-04-20 00:51:35.669460 | orchestrator | Monday 20 April 2026 00:50:23 +0000 (0:00:00.093) 0:03:21.019 ********** 2026-04-20 00:51:35.669465 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.669469 | orchestrator | 2026-04-20 00:51:35.669473 | orchestrator | TASK [k3s_server_post : Determine if Cilium needs update] ********************** 2026-04-20 00:51:35.669478 | orchestrator | Monday 20 April 2026 00:50:23 +0000 (0:00:00.237) 0:03:21.257 ********** 2026-04-20 00:51:35.669482 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.669486 | orchestrator | 2026-04-20 00:51:35.669490 | orchestrator | TASK [k3s_server_post : Log result] ******************************************** 2026-04-20 00:51:35.669495 | orchestrator | Monday 20 April 2026 00:50:23 +0000 (0:00:00.112) 0:03:21.369 ********** 2026-04-20 00:51:35.669499 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.669503 | orchestrator | 2026-04-20 00:51:35.669507 | orchestrator | TASK [k3s_server_post : Install Cilium] **************************************** 2026-04-20 00:51:35.669512 | orchestrator | Monday 20 April 2026 00:50:23 +0000 (0:00:00.105) 0:03:21.475 ********** 2026-04-20 00:51:35.669516 | orchestrator | changed: [testbed-node-0 -> localhost] 2026-04-20 00:51:35.669520 | orchestrator | 2026-04-20 00:51:35.669557 | orchestrator | TASK [k3s_server_post : Wait for Cilium resources] ***************************** 2026-04-20 00:51:35.669563 | orchestrator | Monday 20 April 2026 00:50:27 +0000 (0:00:04.394) 0:03:25.870 ********** 2026-04-20 00:51:35.669567 | orchestrator | ok: [testbed-node-0 -> localhost] => 
(item=deployment/cilium-operator) 2026-04-20 00:51:35.669575 | orchestrator | FAILED - RETRYING: [testbed-node-0 -> localhost]: Wait for Cilium resources (30 retries left). 2026-04-20 00:51:35.669581 | orchestrator | 2026-04-20 00:51:35 | INFO  | Task ef57e714-62b4-4974-b840-da895f71622d is in state SUCCESS 2026-04-20 00:51:35.669586 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=daemonset/cilium) 2026-04-20 00:51:35.669590 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=deployment/hubble-relay) 2026-04-20 00:51:35.669594 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=deployment/hubble-ui) 2026-04-20 00:51:35.669599 | orchestrator | 2026-04-20 00:51:35.669603 | orchestrator | TASK [k3s_server_post : Set _cilium_bgp_neighbors fact] ************************ 2026-04-20 00:51:35.669607 | orchestrator | Monday 20 April 2026 00:51:10 +0000 (0:00:42.358) 0:04:08.228 ********** 2026-04-20 00:51:35.669615 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-04-20 00:51:35.669620 | orchestrator | 2026-04-20 00:51:35.669624 | orchestrator | TASK [k3s_server_post : Copy BGP manifests to first master] ******************** 2026-04-20 00:51:35.669632 | orchestrator | Monday 20 April 2026 00:51:11 +0000 (0:00:01.403) 0:04:09.632 ********** 2026-04-20 00:51:35.669636 | orchestrator | changed: [testbed-node-0 -> localhost] 2026-04-20 00:51:35.669641 | orchestrator | 2026-04-20 00:51:35.669645 | orchestrator | TASK [k3s_server_post : Apply BGP manifests] *********************************** 2026-04-20 00:51:35.669649 | orchestrator | Monday 20 April 2026 00:51:13 +0000 (0:00:01.712) 0:04:11.344 ********** 2026-04-20 00:51:35.669653 | orchestrator | changed: [testbed-node-0 -> localhost] 2026-04-20 00:51:35.669658 | orchestrator | 2026-04-20 00:51:35.669662 | orchestrator | TASK [k3s_server_post : Print error message if BGP manifests application fails] *** 2026-04-20 00:51:35.669666 | orchestrator | Monday 20 April 2026 00:51:14 +0000
(0:00:01.096) 0:04:12.441 ********** 2026-04-20 00:51:35.669670 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.669675 | orchestrator | 2026-04-20 00:51:35.669679 | orchestrator | TASK [k3s_server_post : Test for BGP config resources] ************************* 2026-04-20 00:51:35.669683 | orchestrator | Monday 20 April 2026 00:51:14 +0000 (0:00:00.114) 0:04:12.555 ********** 2026-04-20 00:51:35.669687 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=kubectl get CiliumBGPPeeringPolicy.cilium.io) 2026-04-20 00:51:35.669692 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=kubectl get CiliumLoadBalancerIPPool.cilium.io) 2026-04-20 00:51:35.669696 | orchestrator | 2026-04-20 00:51:35.669700 | orchestrator | TASK [k3s_server_post : Deploy metallb pool] *********************************** 2026-04-20 00:51:35.669705 | orchestrator | Monday 20 April 2026 00:51:16 +0000 (0:00:01.970) 0:04:14.526 ********** 2026-04-20 00:51:35.669709 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.669713 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.669718 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.669722 | orchestrator | 2026-04-20 00:51:35.669726 | orchestrator | TASK [k3s_server_post : Remove tmp directory used for manifests] *************** 2026-04-20 00:51:35.669730 | orchestrator | Monday 20 April 2026 00:51:16 +0000 (0:00:00.273) 0:04:14.799 ********** 2026-04-20 00:51:35.669735 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.669739 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.669743 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.669747 | orchestrator | 2026-04-20 00:51:35.669752 | orchestrator | PLAY [Apply role k9s] ********************************************************** 2026-04-20 00:51:35.669756 | orchestrator | 2026-04-20 00:51:35.669760 | orchestrator | TASK [k9s : Gather variables for each operating system] ************************ 2026-04-20 
00:51:35.669765 | orchestrator | Monday 20 April 2026 00:51:17 +0000 (0:00:00.988) 0:04:15.788 ********** 2026-04-20 00:51:35.669769 | orchestrator | ok: [testbed-manager] 2026-04-20 00:51:35.669773 | orchestrator | 2026-04-20 00:51:35.669778 | orchestrator | TASK [k9s : Include distribution specific install tasks] *********************** 2026-04-20 00:51:35.669782 | orchestrator | Monday 20 April 2026 00:51:17 +0000 (0:00:00.128) 0:04:15.917 ********** 2026-04-20 00:51:35.669786 | orchestrator | included: /ansible/roles/k9s/tasks/install-Debian-family.yml for testbed-manager 2026-04-20 00:51:35.669790 | orchestrator | 2026-04-20 00:51:35.669795 | orchestrator | TASK [k9s : Install k9s packages] ********************************************** 2026-04-20 00:51:35.669799 | orchestrator | Monday 20 April 2026 00:51:18 +0000 (0:00:00.221) 0:04:16.138 ********** 2026-04-20 00:51:35.669803 | orchestrator | changed: [testbed-manager] 2026-04-20 00:51:35.669808 | orchestrator | 2026-04-20 00:51:35.669812 | orchestrator | PLAY [Manage labels, annotations, and taints on all k3s nodes] ***************** 2026-04-20 00:51:35.669816 | orchestrator | 2026-04-20 00:51:35.669821 | orchestrator | TASK [Merge labels, annotations, and taints] *********************************** 2026-04-20 00:51:35.669825 | orchestrator | Monday 20 April 2026 00:51:23 +0000 (0:00:04.958) 0:04:21.096 ********** 2026-04-20 00:51:35.669829 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:51:35.669838 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:51:35.669842 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:51:35.669846 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:51:35.669851 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:51:35.669855 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:51:35.669859 | orchestrator | 2026-04-20 00:51:35.669864 | orchestrator | TASK [Manage labels] *********************************************************** 2026-04-20 00:51:35.669868 | orchestrator | 
Monday 20 April 2026 00:51:23 +0000 (0:00:00.513) 0:04:21.610 ********** 2026-04-20 00:51:35.669872 | orchestrator | ok: [testbed-node-4 -> localhost] => (item=node-role.osism.tech/compute-plane=true) 2026-04-20 00:51:35.669876 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/control-plane=true) 2026-04-20 00:51:35.669881 | orchestrator | ok: [testbed-node-3 -> localhost] => (item=node-role.osism.tech/compute-plane=true) 2026-04-20 00:51:35.669885 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/control-plane=true) 2026-04-20 00:51:35.669889 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/control-plane=true) 2026-04-20 00:51:35.669893 | orchestrator | ok: [testbed-node-5 -> localhost] => (item=node-role.osism.tech/compute-plane=true) 2026-04-20 00:51:35.669901 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=openstack-control-plane=enabled) 2026-04-20 00:51:35.669905 | orchestrator | ok: [testbed-node-3 -> localhost] => (item=node-role.kubernetes.io/worker=worker) 2026-04-20 00:51:35.669910 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=openstack-control-plane=enabled) 2026-04-20 00:51:35.669914 | orchestrator | ok: [testbed-node-4 -> localhost] => (item=node-role.kubernetes.io/worker=worker) 2026-04-20 00:51:35.669918 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=openstack-control-plane=enabled) 2026-04-20 00:51:35.669922 | orchestrator | ok: [testbed-node-5 -> localhost] => (item=node-role.kubernetes.io/worker=worker) 2026-04-20 00:51:35.669927 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/network-plane=true) 2026-04-20 00:51:35.669931 | orchestrator | ok: [testbed-node-3 -> localhost] => (item=node-role.osism.tech/rook-osd=true) 2026-04-20 00:51:35.669939 | orchestrator | ok: [testbed-node-4 -> localhost] => (item=node-role.osism.tech/rook-osd=true) 2026-04-20 00:51:35.669943 | orchestrator | 
ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/network-plane=true) 2026-04-20 00:51:35.669948 | orchestrator | ok: [testbed-node-5 -> localhost] => (item=node-role.osism.tech/rook-osd=true) 2026-04-20 00:51:35.669952 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-mds=true) 2026-04-20 00:51:35.669956 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/network-plane=true) 2026-04-20 00:51:35.669960 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/rook-mds=true) 2026-04-20 00:51:35.669965 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-mgr=true) 2026-04-20 00:51:35.669969 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-mds=true) 2026-04-20 00:51:35.669973 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/rook-mgr=true) 2026-04-20 00:51:35.669978 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-mon=true) 2026-04-20 00:51:35.669982 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-mgr=true) 2026-04-20 00:51:35.669986 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-rgw=true) 2026-04-20 00:51:35.669990 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/rook-mon=true) 2026-04-20 00:51:35.669995 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-mon=true) 2026-04-20 00:51:35.669999 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/rook-rgw=true) 2026-04-20 00:51:35.670003 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-rgw=true) 2026-04-20 00:51:35.670037 | orchestrator | 2026-04-20 00:51:35.670043 | orchestrator | TASK [Manage annotations] ****************************************************** 2026-04-20 
00:51:35.670048 | orchestrator | Monday 20 April 2026 00:51:34 +0000 (0:00:10.454) 0:04:32.064 ********** 2026-04-20 00:51:35.670052 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:51:35.670056 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:51:35.670061 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:51:35.670065 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.670069 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.670074 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.670078 | orchestrator | 2026-04-20 00:51:35.670082 | orchestrator | TASK [Manage taints] *********************************************************** 2026-04-20 00:51:35.670087 | orchestrator | Monday 20 April 2026 00:51:34 +0000 (0:00:00.396) 0:04:32.461 ********** 2026-04-20 00:51:35.670091 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:51:35.670095 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:51:35.670100 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:51:35.670104 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:51:35.670107 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:51:35.670111 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:51:35.670115 | orchestrator | 2026-04-20 00:51:35.670119 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:51:35.670123 | orchestrator | testbed-manager : ok=21  changed=11  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-20 00:51:35.670128 | orchestrator | testbed-node-0 : ok=50  changed=23  unreachable=0 failed=0 skipped=28  rescued=0 ignored=0 2026-04-20 00:51:35.670133 | orchestrator | testbed-node-1 : ok=38  changed=16  unreachable=0 failed=0 skipped=25  rescued=0 ignored=0 2026-04-20 00:51:35.670137 | orchestrator | testbed-node-2 : ok=38  changed=16  unreachable=0 failed=0 skipped=25  rescued=0 ignored=0 2026-04-20 00:51:35.670141 | orchestrator | 
testbed-node-3 : ok=16  changed=8  unreachable=0 failed=0 skipped=17  rescued=0 ignored=0 2026-04-20 00:51:35.670145 | orchestrator | testbed-node-4 : ok=16  changed=8  unreachable=0 failed=0 skipped=17  rescued=0 ignored=0 2026-04-20 00:51:35.670149 | orchestrator | testbed-node-5 : ok=16  changed=8  unreachable=0 failed=0 skipped=17  rescued=0 ignored=0 2026-04-20 00:51:35.670153 | orchestrator | 2026-04-20 00:51:35.670157 | orchestrator | 2026-04-20 00:51:35.670165 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 00:51:35.670169 | orchestrator | Monday 20 April 2026 00:51:34 +0000 (0:00:00.345) 0:04:32.807 ********** 2026-04-20 00:51:35.670173 | orchestrator | =============================================================================== 2026-04-20 00:51:35.670177 | orchestrator | k3s_server : Verify that all nodes actually joined (check k3s-init.service if this fails) -- 53.99s 2026-04-20 00:51:35.670181 | orchestrator | k3s_server_post : Wait for Cilium resources ---------------------------- 42.36s 2026-04-20 00:51:35.670185 | orchestrator | k3s_server : Enable and check K3s service ------------------------------ 24.82s 2026-04-20 00:51:35.670188 | orchestrator | kubectl : Install required packages ------------------------------------ 11.91s 2026-04-20 00:51:35.670192 | orchestrator | Manage labels ---------------------------------------------------------- 10.45s 2026-04-20 00:51:35.670199 | orchestrator | k3s_agent : Manage k3s service ----------------------------------------- 10.00s 2026-04-20 00:51:35.670203 | orchestrator | kubectl : Add repository Debian ----------------------------------------- 7.73s 2026-04-20 00:51:35.670207 | orchestrator | k3s_download : Download k3s binary x64 ---------------------------------- 5.86s 2026-04-20 00:51:35.670215 | orchestrator | k9s : Install k9s packages ---------------------------------------------- 4.96s 2026-04-20 00:51:35.670218 | orchestrator | 
k3s_server_post : Install Cilium ---------------------------------------- 4.39s 2026-04-20 00:51:35.670222 | orchestrator | k3s_server : Remove manifests and folders that are only needed for bootstrapping cluster so k3s doesn't auto apply on start --- 3.88s 2026-04-20 00:51:35.670226 | orchestrator | k3s_server : Set _kube_vip_bgp_peers fact ------------------------------- 3.65s 2026-04-20 00:51:35.670230 | orchestrator | k3s_download : Download k3s binary armhf -------------------------------- 3.52s 2026-04-20 00:51:35.670234 | orchestrator | k3s_server : Init cluster inside the transient k3s-init service --------- 3.16s 2026-04-20 00:51:35.670238 | orchestrator | k3s_custom_registries : Validating arguments against arg spec 'main' - Configure the use of a custom container registry --- 2.78s 2026-04-20 00:51:35.670242 | orchestrator | k3s_server : Detect Kubernetes version for label compatibility ---------- 2.08s 2026-04-20 00:51:35.670246 | orchestrator | k3s_server : Validating arguments against arg spec 'main' - Setup k3s servers --- 2.01s 2026-04-20 00:51:35.670249 | orchestrator | k3s_server_post : Test for BGP config resources ------------------------- 1.97s 2026-04-20 00:51:35.670253 | orchestrator | Make kubeconfig available for use inside the manager service ------------ 1.97s 2026-04-20 00:51:35.670257 | orchestrator | k3s_prereq : Enable IPv6 router advertisements -------------------------- 1.80s 2026-04-20 00:51:35.670261 | orchestrator | 2026-04-20 00:51:35 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:35.670265 | orchestrator | 2026-04-20 00:51:35 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:35.670269 | orchestrator | 2026-04-20 00:51:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:51:38.707093 | orchestrator | 2026-04-20 00:51:38 | INFO  | Task e50b9dfe-99ee-4f5d-840b-b48b523f36a3 is in state STARTED 2026-04-20 00:51:38.707180 | orchestrator | 
2026-04-20 00:51:38 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:38.707190 | orchestrator | 2026-04-20 00:51:38 | INFO  | Task 55e205a6-3d90-4146-aae3-b2430f7115ca is in state STARTED 2026-04-20 00:51:38.707940 | orchestrator | 2026-04-20 00:51:38 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:38.707990 | orchestrator | 2026-04-20 00:51:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:51:41.747315 | orchestrator | 2026-04-20 00:51:41 | INFO  | Task e50b9dfe-99ee-4f5d-840b-b48b523f36a3 is in state SUCCESS 2026-04-20 00:51:41.747404 | orchestrator | 2026-04-20 00:51:41 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:41.747713 | orchestrator | 2026-04-20 00:51:41 | INFO  | Task 55e205a6-3d90-4146-aae3-b2430f7115ca is in state STARTED 2026-04-20 00:51:41.748467 | orchestrator | 2026-04-20 00:51:41 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:41.748496 | orchestrator | 2026-04-20 00:51:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:51:44.788719 | orchestrator | 2026-04-20 00:51:44 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:44.789072 | orchestrator | 2026-04-20 00:51:44 | INFO  | Task 55e205a6-3d90-4146-aae3-b2430f7115ca is in state STARTED 2026-04-20 00:51:44.790554 | orchestrator | 2026-04-20 00:51:44 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:44.790802 | orchestrator | 2026-04-20 00:51:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:51:47.832691 | orchestrator | 2026-04-20 00:51:47 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:47.833102 | orchestrator | 2026-04-20 00:51:47 | INFO  | Task 55e205a6-3d90-4146-aae3-b2430f7115ca is in state SUCCESS 2026-04-20 00:51:47.835798 | orchestrator | 2026-04-20 00:51:47 | INFO  | 
Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:47.835864 | orchestrator | 2026-04-20 00:51:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:51:50.888383 | orchestrator | 2026-04-20 00:51:50 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:50.891018 | orchestrator | 2026-04-20 00:51:50 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:50.891097 | orchestrator | 2026-04-20 00:51:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:51:53.937274 | orchestrator | 2026-04-20 00:51:53 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:53.939414 | orchestrator | 2026-04-20 00:51:53 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:53.939481 | orchestrator | 2026-04-20 00:51:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:51:56.979350 | orchestrator | 2026-04-20 00:51:56 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:51:56.981369 | orchestrator | 2026-04-20 00:51:56 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:51:56.982060 | orchestrator | 2026-04-20 00:51:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:00.018115 | orchestrator | 2026-04-20 00:52:00 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:00.024161 | orchestrator | 2026-04-20 00:52:00 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:00.024271 | orchestrator | 2026-04-20 00:52:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:03.059233 | orchestrator | 2026-04-20 00:52:03 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:03.059680 | orchestrator | 2026-04-20 00:52:03 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 
00:52:03.059770 | orchestrator | 2026-04-20 00:52:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:06.090175 | orchestrator | 2026-04-20 00:52:06 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:06.090294 | orchestrator | 2026-04-20 00:52:06 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:06.090307 | orchestrator | 2026-04-20 00:52:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:09.132600 | orchestrator | 2026-04-20 00:52:09 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:09.133773 | orchestrator | 2026-04-20 00:52:09 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:09.133891 | orchestrator | 2026-04-20 00:52:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:12.172687 | orchestrator | 2026-04-20 00:52:12 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:12.173062 | orchestrator | 2026-04-20 00:52:12 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:12.173093 | orchestrator | 2026-04-20 00:52:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:15.223848 | orchestrator | 2026-04-20 00:52:15 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:15.226212 | orchestrator | 2026-04-20 00:52:15 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:15.226260 | orchestrator | 2026-04-20 00:52:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:18.265636 | orchestrator | 2026-04-20 00:52:18 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:18.266323 | orchestrator | 2026-04-20 00:52:18 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:18.266350 | orchestrator | 2026-04-20 00:52:18 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 00:52:21.295619 | orchestrator | 2026-04-20 00:52:21 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:21.296839 | orchestrator | 2026-04-20 00:52:21 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:21.296894 | orchestrator | 2026-04-20 00:52:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:24.330078 | orchestrator | 2026-04-20 00:52:24 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:24.330358 | orchestrator | 2026-04-20 00:52:24 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:24.330380 | orchestrator | 2026-04-20 00:52:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:27.366695 | orchestrator | 2026-04-20 00:52:27 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:27.366798 | orchestrator | 2026-04-20 00:52:27 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:27.366812 | orchestrator | 2026-04-20 00:52:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:30.396727 | orchestrator | 2026-04-20 00:52:30 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:30.397599 | orchestrator | 2026-04-20 00:52:30 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:30.397651 | orchestrator | 2026-04-20 00:52:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:33.431741 | orchestrator | 2026-04-20 00:52:33 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:33.432424 | orchestrator | 2026-04-20 00:52:33 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:33.433397 | orchestrator | 2026-04-20 00:52:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:36.482942 | orchestrator | 2026-04-20 
00:52:36 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:36.483637 | orchestrator | 2026-04-20 00:52:36 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:36.483733 | orchestrator | 2026-04-20 00:52:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:39.518300 | orchestrator | 2026-04-20 00:52:39 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:39.520630 | orchestrator | 2026-04-20 00:52:39 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:39.520689 | orchestrator | 2026-04-20 00:52:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:42.551541 | orchestrator | 2026-04-20 00:52:42 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:42.552587 | orchestrator | 2026-04-20 00:52:42 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:42.552752 | orchestrator | 2026-04-20 00:52:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:45.598336 | orchestrator | 2026-04-20 00:52:45 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:45.598923 | orchestrator | 2026-04-20 00:52:45 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:45.598989 | orchestrator | 2026-04-20 00:52:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:48.629192 | orchestrator | 2026-04-20 00:52:48 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:48.630735 | orchestrator | 2026-04-20 00:52:48 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:48.631004 | orchestrator | 2026-04-20 00:52:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:51.663438 | orchestrator | 2026-04-20 00:52:51 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state 
STARTED 2026-04-20 00:52:51.664689 | orchestrator | 2026-04-20 00:52:51 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:51.664733 | orchestrator | 2026-04-20 00:52:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:54.718290 | orchestrator | 2026-04-20 00:52:54 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:54.718371 | orchestrator | 2026-04-20 00:52:54 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:54.724238 | orchestrator | 2026-04-20 00:52:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:52:57.760372 | orchestrator | 2026-04-20 00:52:57 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:52:57.761235 | orchestrator | 2026-04-20 00:52:57 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:52:57.761269 | orchestrator | 2026-04-20 00:52:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:53:00.801633 | orchestrator | 2026-04-20 00:53:00 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:53:00.803641 | orchestrator | 2026-04-20 00:53:00 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:53:00.803679 | orchestrator | 2026-04-20 00:53:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:53:03.852105 | orchestrator | 2026-04-20 00:53:03 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:53:03.855542 | orchestrator | 2026-04-20 00:53:03 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:53:03.855618 | orchestrator | 2026-04-20 00:53:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:53:06.916434 | orchestrator | 2026-04-20 00:53:06 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:53:06.918079 | orchestrator | 2026-04-20 00:53:06 | INFO  
| Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:53:06.920058 | orchestrator | 2026-04-20 00:53:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:53:09.963850 | orchestrator | 2026-04-20 00:53:09 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:53:09.964165 | orchestrator | 2026-04-20 00:53:09 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:53:09.965126 | orchestrator | 2026-04-20 00:53:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:53:13.012803 | orchestrator | 2026-04-20 00:53:13 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:53:13.014574 | orchestrator | 2026-04-20 00:53:13 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:53:13.014652 | orchestrator | 2026-04-20 00:53:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:53:16.064449 | orchestrator | 2026-04-20 00:53:16 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:53:16.065386 | orchestrator | 2026-04-20 00:53:16 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:53:16.065411 | orchestrator | 2026-04-20 00:53:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:53:19.109549 | orchestrator | 2026-04-20 00:53:19 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:53:19.110556 | orchestrator | 2026-04-20 00:53:19 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 00:53:19.110616 | orchestrator | 2026-04-20 00:53:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:53:22.141741 | orchestrator | 2026-04-20 00:53:22 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:53:22.142214 | orchestrator | 2026-04-20 00:53:22 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED 2026-04-20 
00:53:22.142258 | orchestrator | 2026-04-20 00:53:22 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:53:25.177681 | orchestrator | 2026-04-20 00:53:25 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:53:25.177803 | orchestrator | 2026-04-20 00:53:25 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:53:25.177813 | orchestrator | 2026-04-20 00:53:25 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:53:28.219132 | orchestrator | 2026-04-20 00:53:28 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:53:28.220682 | orchestrator | 2026-04-20 00:53:28 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:53:28.220735 | orchestrator | 2026-04-20 00:53:28 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:53:31.257257 | orchestrator | 2026-04-20 00:53:31 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:53:31.257351 | orchestrator | 2026-04-20 00:53:31 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:53:31.257364 | orchestrator | 2026-04-20 00:53:31 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:53:34.301943 | orchestrator | 2026-04-20 00:53:34 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:53:34.303734 | orchestrator | 2026-04-20 00:53:34 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:53:34.303969 | orchestrator | 2026-04-20 00:53:34 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:53:37.356223 | orchestrator | 2026-04-20 00:53:37 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:53:37.360570 | orchestrator | 2026-04-20 00:53:37 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:53:37.360732 | orchestrator | 2026-04-20 00:53:37 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:53:40.404286 | orchestrator | 2026-04-20 00:53:40 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:53:40.407687 | orchestrator | 2026-04-20 00:53:40 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:53:40.407756 | orchestrator | 2026-04-20 00:53:40 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:53:43.442872 | orchestrator | 2026-04-20 00:53:43 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:53:43.444617 | orchestrator | 2026-04-20 00:53:43 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:53:43.444696 | orchestrator | 2026-04-20 00:53:43 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:53:46.486080 | orchestrator | 2026-04-20 00:53:46 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:53:46.487903 | orchestrator | 2026-04-20 00:53:46 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state STARTED
2026-04-20 00:53:46.488528 | orchestrator | 2026-04-20 00:53:46 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:53:49.535735 | orchestrator | 2026-04-20 00:53:49 | INFO  | Task e7f09b97-4413-48e5-82eb-682ca07fb073 is in state STARTED
2026-04-20 00:53:49.536855 | orchestrator | 2026-04-20 00:53:49 | INFO  | Task d5ae424c-8b69-46b4-bdab-8948d1da77c1 is in state STARTED
2026-04-20 00:53:49.537178 | orchestrator | 2026-04-20 00:53:49 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:53:49.546570 | orchestrator | 2026-04-20 00:53:49 | INFO  | Task 338954ce-4261-4410-8652-c2a3514188f8 is in state SUCCESS
2026-04-20 00:53:49.547890 | orchestrator |
2026-04-20 00:53:49.548005 | orchestrator |
2026-04-20 00:53:49.548015 | orchestrator | PLAY [Copy kubeconfig to the configuration repository] *************************
2026-04-20 00:53:49.548053 | orchestrator |
2026-04-20 00:53:49.548061 | orchestrator | TASK [Get kubeconfig file] *****************************************************
2026-04-20 00:53:49.548108 | orchestrator | Monday 20 April 2026 00:51:38 +0000 (0:00:00.208) 0:00:00.208 **********
2026-04-20 00:53:49.548116 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)]
2026-04-20 00:53:49.548180 | orchestrator |
2026-04-20 00:53:49.548184 | orchestrator | TASK [Write kubeconfig file] ***************************************************
2026-04-20 00:53:49.548188 | orchestrator | Monday 20 April 2026 00:51:39 +0000 (0:00:01.020) 0:00:01.229 **********
2026-04-20 00:53:49.548192 | orchestrator | changed: [testbed-manager]
2026-04-20 00:53:49.548196 | orchestrator |
2026-04-20 00:53:49.548200 | orchestrator | TASK [Change server address in the kubeconfig file] ****************************
2026-04-20 00:53:49.548204 | orchestrator | Monday 20 April 2026 00:51:40 +0000 (0:00:01.438) 0:00:02.668 **********
2026-04-20 00:53:49.548208 | orchestrator | changed: [testbed-manager]
2026-04-20 00:53:49.548211 | orchestrator |
2026-04-20 00:53:49.548215 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:53:49.548220 | orchestrator | testbed-manager : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:53:49.548224 | orchestrator |
2026-04-20 00:53:49.548228 | orchestrator |
2026-04-20 00:53:49.548232 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:53:49.548236 | orchestrator | Monday 20 April 2026 00:51:40 +0000 (0:00:00.455) 0:00:03.123 **********
2026-04-20 00:53:49.548240 | orchestrator | ===============================================================================
2026-04-20 00:53:49.548243 | orchestrator | Write kubeconfig file --------------------------------------------------- 1.44s
2026-04-20 00:53:49.548248 | orchestrator | Get kubeconfig file ----------------------------------------------------- 1.02s
2026-04-20 00:53:49.548251 | orchestrator | Change server address in the kubeconfig file ---------------------------- 0.46s
2026-04-20 00:53:49.548255 | orchestrator |
2026-04-20 00:53:49.548259 | orchestrator |
2026-04-20 00:53:49.548263 | orchestrator | PLAY [Prepare kubeconfig file] *************************************************
2026-04-20 00:53:49.548266 | orchestrator |
2026-04-20 00:53:49.548270 | orchestrator | TASK [Get home directory of operator user] *************************************
2026-04-20 00:53:49.548274 | orchestrator | Monday 20 April 2026 00:51:37 +0000 (0:00:00.183) 0:00:00.183 **********
2026-04-20 00:53:49.548280 | orchestrator | ok: [testbed-manager]
2026-04-20 00:53:49.548287 | orchestrator |
2026-04-20 00:53:49.548297 | orchestrator | TASK [Create .kube directory] **************************************************
2026-04-20 00:53:49.548304 | orchestrator | Monday 20 April 2026 00:51:38 +0000 (0:00:00.713) 0:00:00.896 **********
2026-04-20 00:53:49.548310 | orchestrator | ok: [testbed-manager]
2026-04-20 00:53:49.548316 | orchestrator |
2026-04-20 00:53:49.548344 | orchestrator | TASK [Get kubeconfig file] *****************************************************
2026-04-20 00:53:49.548351 | orchestrator | Monday 20 April 2026 00:51:39 +0000 (0:00:00.457) 0:00:01.353 **********
2026-04-20 00:53:49.548357 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)]
2026-04-20 00:53:49.548364 | orchestrator |
2026-04-20 00:53:49.548370 | orchestrator | TASK [Write kubeconfig file] ***************************************************
2026-04-20 00:53:49.548377 | orchestrator | Monday 20 April 2026 00:51:39 +0000 (0:00:00.911) 0:00:02.264 **********
2026-04-20 00:53:49.548381 | orchestrator | changed: [testbed-manager]
2026-04-20 00:53:49.548385 | orchestrator |
2026-04-20 00:53:49.548388 | orchestrator | TASK [Change server address in the kubeconfig] *********************************
2026-04-20 00:53:49.548392 | orchestrator | Monday 20 April 2026 00:51:41 +0000 (0:00:01.142) 0:00:03.407 **********
2026-04-20 00:53:49.548396 | orchestrator | changed: [testbed-manager]
2026-04-20 00:53:49.548400 | orchestrator |
2026-04-20 00:53:49.548404 | orchestrator | TASK [Make kubeconfig available for use inside the manager service] ************
2026-04-20 00:53:49.548407 | orchestrator | Monday 20 April 2026 00:51:41 +0000 (0:00:00.555) 0:00:03.962 **********
2026-04-20 00:53:49.548411 | orchestrator | changed: [testbed-manager -> localhost]
2026-04-20 00:53:49.548416 | orchestrator |
2026-04-20 00:53:49.548420 | orchestrator | TASK [Change server address in the kubeconfig inside the manager service] ******
2026-04-20 00:53:49.548444 | orchestrator | Monday 20 April 2026 00:51:43 +0000 (0:00:01.632) 0:00:05.595 **********
2026-04-20 00:53:49.548448 | orchestrator | changed: [testbed-manager -> localhost]
2026-04-20 00:53:49.548452 | orchestrator |
2026-04-20 00:53:49.548456 | orchestrator | TASK [Set KUBECONFIG environment variable] *************************************
2026-04-20 00:53:49.548506 | orchestrator | Monday 20 April 2026 00:51:44 +0000 (0:00:00.788) 0:00:06.384 **********
2026-04-20 00:53:49.548510 | orchestrator | ok: [testbed-manager]
2026-04-20 00:53:49.548514 | orchestrator |
2026-04-20 00:53:49.548518 | orchestrator | TASK [Enable kubectl command line completion] **********************************
2026-04-20 00:53:49.548533 | orchestrator | Monday 20 April 2026 00:51:44 +0000 (0:00:00.375) 0:00:06.759 **********
2026-04-20 00:53:49.548537 | orchestrator | ok: [testbed-manager]
2026-04-20 00:53:49.548541 | orchestrator |
2026-04-20 00:53:49.548545 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:53:49.548549 | orchestrator | testbed-manager : ok=9  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:53:49.548553 | orchestrator |
2026-04-20 00:53:49.548557 | orchestrator |
2026-04-20 00:53:49.548560 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:53:49.548564 | orchestrator | Monday 20 April 2026 00:51:44 +0000 (0:00:00.300) 0:00:07.059 **********
2026-04-20 00:53:49.548568 | orchestrator | ===============================================================================
2026-04-20 00:53:49.548572 | orchestrator | Make kubeconfig available for use inside the manager service ------------ 1.63s
2026-04-20 00:53:49.548576 | orchestrator | Write kubeconfig file --------------------------------------------------- 1.14s
2026-04-20 00:53:49.548580 | orchestrator | Get kubeconfig file ----------------------------------------------------- 0.91s
2026-04-20 00:53:49.548597 | orchestrator | Change server address in the kubeconfig inside the manager service ------ 0.79s
2026-04-20 00:53:49.548601 | orchestrator | Get home directory of operator user ------------------------------------- 0.71s
2026-04-20 00:53:49.548605 | orchestrator | Change server address in the kubeconfig --------------------------------- 0.56s
2026-04-20 00:53:49.548609 | orchestrator | Create .kube directory -------------------------------------------------- 0.46s
2026-04-20 00:53:49.548614 | orchestrator | Set KUBECONFIG environment variable ------------------------------------- 0.38s
2026-04-20 00:53:49.548618 | orchestrator | Enable kubectl command line completion ---------------------------------- 0.30s
2026-04-20 00:53:49.548828 | orchestrator |
2026-04-20 00:53:49.548838 | orchestrator |
2026-04-20 00:53:49.548843 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-20 00:53:49.548855 | orchestrator |
2026-04-20 00:53:49.548859 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-20 00:53:49.548864 | orchestrator | Monday 20 April 2026 00:48:22 +0000 (0:00:00.326) 0:00:00.326 **********
2026-04-20 00:53:49.548868 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:53:49.548873 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:53:49.548877 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:53:49.548881 | orchestrator |
2026-04-20 00:53:49.548886 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-20 00:53:49.548890 | orchestrator | Monday 20 April 2026 00:48:22 +0000 (0:00:00.286) 0:00:00.612 **********
2026-04-20 00:53:49.548894 | orchestrator | ok: [testbed-node-0] => (item=enable_loadbalancer_True)
2026-04-20 00:53:49.548899 | orchestrator | ok: [testbed-node-1] => (item=enable_loadbalancer_True)
2026-04-20 00:53:49.548903 | orchestrator | ok: [testbed-node-2] => (item=enable_loadbalancer_True)
2026-04-20 00:53:49.548907 | orchestrator |
2026-04-20 00:53:49.548912 | orchestrator | PLAY [Apply role loadbalancer] *************************************************
2026-04-20 00:53:49.548916 | orchestrator |
2026-04-20 00:53:49.548920 | orchestrator | TASK [loadbalancer : include_tasks] ********************************************
2026-04-20 00:53:49.548925 | orchestrator | Monday 20 April 2026 00:48:23 +0000 (0:00:00.467) 0:00:01.080 **********
2026-04-20 00:53:49.548929 | orchestrator | included: /ansible/roles/loadbalancer/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:53:49.548934 | orchestrator |
2026-04-20 00:53:49.548938 | orchestrator | TASK [loadbalancer : Check IPv6 support] ***************************************
2026-04-20 00:53:49.548943 | orchestrator | Monday 20 April 2026 00:48:24 +0000 (0:00:00.832) 0:00:01.912 **********
2026-04-20 00:53:49.548948 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:53:49.548952 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:53:49.548957 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:53:49.548961 | orchestrator |
2026-04-20 00:53:49.548966 | orchestrator | TASK [Setting sysctl values] ***************************************************
2026-04-20 00:53:49.548970 | orchestrator | Monday 20 April 2026 00:48:25 +0000 (0:00:01.207) 0:00:03.119 **********
2026-04-20 00:53:49.548974 | orchestrator | included: sysctl for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:53:49.548979 | orchestrator |
2026-04-20 00:53:49.548983 | orchestrator | TASK [sysctl : Check IPv6 support] *********************************************
2026-04-20 00:53:49.548987 | orchestrator | Monday 20 April 2026 00:48:26 +0000 (0:00:01.214) 0:00:04.412 **********
2026-04-20 00:53:49.548992 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:53:49.548996 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:53:49.549000 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:53:49.549005 | orchestrator |
2026-04-20 00:53:49.549009 | orchestrator | TASK [sysctl : Setting sysctl values] ******************************************
2026-04-20 00:53:49.549014 | orchestrator | Monday 20 April 2026 00:48:27 +0000 (0:00:01.214) 0:00:05.626 **********
2026-04-20 00:53:49.549018 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1})
2026-04-20 00:53:49.549022 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1})
2026-04-20 00:53:49.549025 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1})
2026-04-20 00:53:49.549029 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1})
2026-04-20 00:53:49.549033 | orchestrator | ok: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'})
2026-04-20 00:53:49.549037 | orchestrator | ok: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'})
2026-04-20 00:53:49.549041 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1})
2026-04-20 00:53:49.549045 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128})
2026-04-20 00:53:49.549054 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1})
2026-04-20 00:53:49.549062 | orchestrator | ok: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'})
2026-04-20 00:53:49.549066 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128})
2026-04-20 00:53:49.549070 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128})
2026-04-20 00:53:49.549073 | orchestrator |
2026-04-20 00:53:49.549077 | orchestrator | TASK [module-load : Load modules] **********************************************
2026-04-20 00:53:49.549081 | orchestrator | Monday 20 April 2026 00:48:32 +0000 (0:00:04.666) 0:00:10.292 **********
2026-04-20 00:53:49.549085 | orchestrator | changed: [testbed-node-0] => (item=ip_vs)
2026-04-20 00:53:49.549089 | orchestrator | changed: [testbed-node-1] => (item=ip_vs)
2026-04-20 00:53:49.549093 | orchestrator | changed: [testbed-node-2] => (item=ip_vs)
2026-04-20 00:53:49.549096 | orchestrator |
2026-04-20 00:53:49.549100 | orchestrator | TASK [module-load : Persist modules via modules-load.d] ************************
2026-04-20 00:53:49.549104 | orchestrator | Monday 20 April 2026 00:48:33 +0000 (0:00:01.032) 0:00:11.325 **********
2026-04-20 00:53:49.549149 | orchestrator | changed: [testbed-node-0] => (item=ip_vs)
2026-04-20 00:53:49.549154 | orchestrator | changed: [testbed-node-2] => (item=ip_vs)
2026-04-20 00:53:49.549158 | orchestrator | changed: [testbed-node-1] => (item=ip_vs)
2026-04-20 00:53:49.549161 | orchestrator |
2026-04-20 00:53:49.549165 | orchestrator | TASK [module-load : Drop module persistence] ***********************************
2026-04-20 00:53:49.549169 | orchestrator | Monday 20 April 2026 00:48:35 +0000 (0:00:01.612) 0:00:12.938 **********
2026-04-20 00:53:49.549173 | orchestrator | skipping: [testbed-node-0] => (item=ip_vs)
2026-04-20 00:53:49.549177 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.549184 | orchestrator | skipping: [testbed-node-1] => (item=ip_vs)
2026-04-20 00:53:49.549188 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.549192 | orchestrator | skipping: [testbed-node-2] => (item=ip_vs)
2026-04-20 00:53:49.549196 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.549199 | orchestrator |
2026-04-20 00:53:49.549203 | orchestrator | TASK [loadbalancer : Ensuring config directories exist] ************************
2026-04-20 00:53:49.549207 | orchestrator | Monday 20 April 2026 00:48:35 +0000 (0:00:00.655) 0:00:13.593 **********
2026-04-20 00:53:49.549213 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})
2026-04-20 00:53:49.549221 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})
2026-04-20 00:53:49.549225 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})
2026-04-20 00:53:49.549240 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-20 00:53:49.549244 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-20 00:53:49.549373 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-20 00:53:49.549380 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2026-04-20 00:53:49.549385 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2026-04-20 00:53:49.549389 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2026-04-20 00:53:49.549393 | orchestrator |
2026-04-20 00:53:49.549401 | orchestrator | TASK [loadbalancer : Ensuring haproxy service config subdir exists] ************
2026-04-20 00:53:49.549440 | orchestrator | Monday 20 April 2026 00:48:38 +0000 (0:00:02.189) 0:00:15.783 **********
2026-04-20 00:53:49.549447 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:53:49.549452 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:53:49.549458 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:53:49.549463 | orchestrator |
2026-04-20 00:53:49.549469 | orchestrator | TASK [loadbalancer : Ensuring proxysql service config subdirectories exist] ****
2026-04-20 00:53:49.549475 | orchestrator | Monday 20 April 2026 00:48:39 +0000 (0:00:01.255) 0:00:17.038 **********
2026-04-20 00:53:49.549481 | orchestrator | changed: [testbed-node-0] => (item=users)
2026-04-20 00:53:49.549488 | orchestrator | changed: [testbed-node-1] => (item=users)
2026-04-20 00:53:49.549492 | orchestrator | changed: [testbed-node-2] => (item=users)
2026-04-20 00:53:49.549496 | orchestrator | changed: [testbed-node-1] => (item=rules)
2026-04-20 00:53:49.549500 | orchestrator | changed: [testbed-node-0] => (item=rules)
2026-04-20 00:53:49.549503 | orchestrator | changed: [testbed-node-2] => (item=rules)
2026-04-20 00:53:49.549507 | orchestrator |
2026-04-20 00:53:49.549511 | orchestrator | TASK [loadbalancer : Ensuring keepalived checks subdir exists] *****************
2026-04-20 00:53:49.549514 | orchestrator | Monday 20 April 2026 00:48:41 +0000 (0:00:02.593) 0:00:19.632 **********
2026-04-20 00:53:49.549518 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:53:49.549522 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:53:49.549526 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:53:49.549529 | orchestrator |
2026-04-20 00:53:49.549537 | orchestrator | TASK [loadbalancer : Remove mariadb.cfg if proxysql enabled] *******************
2026-04-20 00:53:49.549540 | orchestrator | Monday 20 April 2026 00:48:44 +0000 (0:00:02.538) 0:00:22.171 **********
2026-04-20 00:53:49.549544 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:53:49.549548 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:53:49.549552 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:53:49.549556 | orchestrator |
2026-04-20 00:53:49.549559 | orchestrator | TASK [loadbalancer : Removing checks for services which are disabled] **********
2026-04-20 00:53:49.549563 | orchestrator | Monday 20 April 2026 00:48:46 +0000 (0:00:01.978) 0:00:24.149 **********
2026-04-20 00:53:49.549567 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})
2026-04-20 00:53:49.549578 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-20 00:53:49.549582 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2026-04-20 00:53:49.549593 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy-ssh:9.6.20260328', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__ab8f41aa58a829bc6329c21815395200072e2744', '__omit_place_holder__ab8f41aa58a829bc6329c21815395200072e2744'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})
2026-04-20 00:53:49.549597 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.549601 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})
2026-04-20 00:53:49.549608 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-20 00:53:49.549612 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2026-04-20 00:53:49.549653 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy-ssh:9.6.20260328', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__ab8f41aa58a829bc6329c21815395200072e2744', '__omit_place_holder__ab8f41aa58a829bc6329c21815395200072e2744'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})
2026-04-20 00:53:49.549659 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.549663 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})
2026-04-20 00:53:49.549671 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-20 00:53:49.549675 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2026-04-20 00:53:49.549679 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy-ssh:9.6.20260328', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__ab8f41aa58a829bc6329c21815395200072e2744', '__omit_place_holder__ab8f41aa58a829bc6329c21815395200072e2744'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})
2026-04-20 00:53:49.549686 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.549690 | orchestrator |
2026-04-20 00:53:49.549694 | orchestrator | TASK [loadbalancer : Copying checks for services which are enabled] ************
2026-04-20 00:53:49.549698 | orchestrator | Monday 20 April 2026 00:48:47 +0000 (0:00:00.873) 0:00:25.023 **********
2026-04-20 00:53:49.549702 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})
2026-04-20 00:53:49.549712 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})
2026-04-20 00:53:49.549716 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})
2026-04-20 00:53:49.549723 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-20 00:53:49.549727 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2026-04-20 00:53:49.549731 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy-ssh:9.6.20260328', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__ab8f41aa58a829bc6329c21815395200072e2744', '__omit_place_holder__ab8f41aa58a829bc6329c21815395200072e2744'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})
2026-04-20 00:53:49.549742
| orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-20 00:53:49.549746 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-20 00:53:49.549753 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-20 00:53:49.549760 | orchestrator | skipping: [testbed-node-0] => 
(item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-20 00:53:49.549764 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy-ssh:9.6.20260328', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__ab8f41aa58a829bc6329c21815395200072e2744', '__omit_place_holder__ab8f41aa58a829bc6329c21815395200072e2744'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2026-04-20 00:53:49.549769 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy-ssh:9.6.20260328', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__ab8f41aa58a829bc6329c21815395200072e2744', '__omit_place_holder__ab8f41aa58a829bc6329c21815395200072e2744'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2026-04-20 
00:53:49.549773 | orchestrator | 2026-04-20 00:53:49.549776 | orchestrator | TASK [loadbalancer : Copying over config.json files for services] ************** 2026-04-20 00:53:49.549780 | orchestrator | Monday 20 April 2026 00:48:51 +0000 (0:00:04.189) 0:00:29.212 ********** 2026-04-20 00:53:49.549826 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2026-04-20 00:53:49.549834 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2026-04-20 00:53:49.549843 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2026-04-20 00:53:49.549852 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-20 00:53:49.549858 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-20 00:53:49.549868 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': 
False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-20 00:53:49.549880 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-20 00:53:49.549886 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-20 00:53:49.549892 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-20 00:53:49.549903 | orchestrator | 2026-04-20 00:53:49.549913 | orchestrator | TASK [loadbalancer : Copying over haproxy.cfg] ********************************* 2026-04-20 00:53:49.549919 | orchestrator | Monday 20 April 2026 00:48:55 +0000 (0:00:03.879) 0:00:33.092 ********** 2026-04-20 00:53:49.549925 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2) 2026-04-20 00:53:49.549932 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2) 2026-04-20 00:53:49.549937 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2) 2026-04-20 00:53:49.549943 | orchestrator | 2026-04-20 00:53:49.549949 | orchestrator | TASK [loadbalancer : Copying over proxysql config] ***************************** 2026-04-20 00:53:49.549993 | orchestrator | Monday 20 April 2026 00:48:57 +0000 (0:00:01.856) 0:00:34.949 ********** 2026-04-20 00:53:49.549999 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2) 2026-04-20 00:53:49.550003 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2) 2026-04-20 00:53:49.550007 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2) 2026-04-20 00:53:49.550681 | orchestrator | 2026-04-20 00:53:49.550704 | orchestrator | TASK [loadbalancer : Copying over haproxy single external frontend config] ***** 2026-04-20 00:53:49.550711 | orchestrator | Monday 20 April 2026 00:49:00 +0000 (0:00:03.647) 0:00:38.596 ********** 2026-04-20 00:53:49.550718 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.550726 | orchestrator | skipping: 
[testbed-node-1] 2026-04-20 00:53:49.550732 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.550739 | orchestrator | 2026-04-20 00:53:49.550746 | orchestrator | TASK [loadbalancer : Copying over custom haproxy services configuration] ******* 2026-04-20 00:53:49.550751 | orchestrator | Monday 20 April 2026 00:49:01 +0000 (0:00:00.509) 0:00:39.106 ********** 2026-04-20 00:53:49.550755 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg) 2026-04-20 00:53:49.550762 | orchestrator | changed: [testbed-node-0] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg) 2026-04-20 00:53:49.550766 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg) 2026-04-20 00:53:49.550770 | orchestrator | 2026-04-20 00:53:49.550773 | orchestrator | TASK [loadbalancer : Copying over keepalived.conf] ***************************** 2026-04-20 00:53:49.550777 | orchestrator | Monday 20 April 2026 00:49:03 +0000 (0:00:02.315) 0:00:41.422 ********** 2026-04-20 00:53:49.550781 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2) 2026-04-20 00:53:49.550785 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2) 2026-04-20 00:53:49.550789 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2) 2026-04-20 00:53:49.550793 | orchestrator | 2026-04-20 00:53:49.550797 | orchestrator | TASK [loadbalancer : include_tasks] ******************************************** 2026-04-20 00:53:49.550800 | orchestrator | Monday 20 April 2026 00:49:05 +0000 (0:00:01.915) 0:00:43.337 ********** 2026-04-20 00:53:49.550804 | orchestrator | included: /ansible/roles/loadbalancer/tasks/copy-certs.yml for 
testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:53:49.550808 | orchestrator | 2026-04-20 00:53:49.550812 | orchestrator | TASK [loadbalancer : Copying over haproxy.pem] ********************************* 2026-04-20 00:53:49.550816 | orchestrator | Monday 20 April 2026 00:49:06 +0000 (0:00:00.559) 0:00:43.896 ********** 2026-04-20 00:53:49.550834 | orchestrator | changed: [testbed-node-0] => (item=haproxy.pem) 2026-04-20 00:53:49.550839 | orchestrator | changed: [testbed-node-1] => (item=haproxy.pem) 2026-04-20 00:53:49.550842 | orchestrator | changed: [testbed-node-2] => (item=haproxy.pem) 2026-04-20 00:53:49.550846 | orchestrator | 2026-04-20 00:53:49.550850 | orchestrator | TASK [loadbalancer : Copying over haproxy-internal.pem] ************************ 2026-04-20 00:53:49.550853 | orchestrator | Monday 20 April 2026 00:49:07 +0000 (0:00:01.774) 0:00:45.671 ********** 2026-04-20 00:53:49.550857 | orchestrator | changed: [testbed-node-0] => (item=haproxy-internal.pem) 2026-04-20 00:53:49.550861 | orchestrator | changed: [testbed-node-2] => (item=haproxy-internal.pem) 2026-04-20 00:53:49.550865 | orchestrator | changed: [testbed-node-1] => (item=haproxy-internal.pem) 2026-04-20 00:53:49.550868 | orchestrator | 2026-04-20 00:53:49.550872 | orchestrator | TASK [loadbalancer : Copying over proxysql-cert.pem] *************************** 2026-04-20 00:53:49.550878 | orchestrator | Monday 20 April 2026 00:49:09 +0000 (0:00:01.692) 0:00:47.363 ********** 2026-04-20 00:53:49.550884 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.550890 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.550895 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.550901 | orchestrator | 2026-04-20 00:53:49.550907 | orchestrator | TASK [loadbalancer : Copying over proxysql-key.pem] **************************** 2026-04-20 00:53:49.550912 | orchestrator | Monday 20 April 2026 00:49:09 +0000 (0:00:00.253) 0:00:47.617 ********** 2026-04-20 
00:53:49.550918 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.550924 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.550930 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.550936 | orchestrator | 2026-04-20 00:53:49.550943 | orchestrator | TASK [service-cert-copy : mariadb | Copying over extra CA certificates] ******** 2026-04-20 00:53:49.550949 | orchestrator | Monday 20 April 2026 00:49:10 +0000 (0:00:00.267) 0:00:47.885 ********** 2026-04-20 00:53:49.550965 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2026-04-20 00:53:49.550973 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2026-04-20 00:53:49.550977 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 
'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2026-04-20 00:53:49.550981 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-20 00:53:49.550993 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-20 00:53:49.551051 | orchestrator | changed: 
[testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-20 00:53:49.551271 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-20 00:53:49.551286 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-20 00:53:49.551290 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-20 00:53:49.551294 | orchestrator | 2026-04-20 00:53:49.551298 | orchestrator | TASK [service-cert-copy : mariadb | Copying over backend internal TLS certificate] *** 2026-04-20 00:53:49.551302 | orchestrator | Monday 20 April 2026 00:49:13 +0000 (0:00:03.681) 0:00:51.566 ********** 2026-04-20 00:53:49.551306 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2026-04-20 00:53:49.551321 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-20 00:53:49.551325 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-20 00:53:49.551329 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.551333 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2026-04-20 00:53:49.551360 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 
'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-20 00:53:49.551365 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-20 00:53:49.551376 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.551380 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2026-04-20 00:53:49.551387 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-20 00:53:49.551394 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-20 00:53:49.551398 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.551402 | orchestrator | 2026-04-20 00:53:49.551405 | orchestrator | TASK [service-cert-copy : mariadb | Copying over backend internal TLS key] ***** 2026-04-20 00:53:49.551409 | orchestrator | Monday 20 April 2026 00:49:14 +0000 (0:00:00.651) 0:00:52.218 ********** 2026-04-20 00:53:49.551413 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2026-04-20 00:53:49.551526 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 
'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-20 00:53:49.551534 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-20 00:53:49.551538 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.551573 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2026-04-20 00:53:49.551606 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 
'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-20 00:53:49.551614 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-20 00:53:49.551830 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.551836 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2026-04-20 00:53:49.551854 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-20 00:53:49.551858 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-20 00:53:49.551863 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.551867 | orchestrator | 2026-04-20 00:53:49.551871 | orchestrator | TASK [loadbalancer : Copying over haproxy start script] ************************ 2026-04-20 00:53:49.551875 | orchestrator | Monday 20 April 2026 00:49:15 +0000 (0:00:00.803) 0:00:53.021 ********** 2026-04-20 00:53:49.551885 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2026-04-20 00:53:49.551890 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2026-04-20 00:53:49.551894 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2026-04-20 00:53:49.551897 | orchestrator | 2026-04-20 00:53:49.551901 | orchestrator | TASK [loadbalancer : Copying over proxysql start 
script] *********************** 2026-04-20 00:53:49.551905 | orchestrator | Monday 20 April 2026 00:49:16 +0000 (0:00:01.570) 0:00:54.592 ********** 2026-04-20 00:53:49.551909 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2026-04-20 00:53:49.551913 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2026-04-20 00:53:49.551917 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2026-04-20 00:53:49.551920 | orchestrator | 2026-04-20 00:53:49.551924 | orchestrator | TASK [loadbalancer : Copying files for haproxy-ssh] **************************** 2026-04-20 00:53:49.551928 | orchestrator | Monday 20 April 2026 00:49:18 +0000 (0:00:01.658) 0:00:56.250 ********** 2026-04-20 00:53:49.551932 | orchestrator | skipping: [testbed-node-0] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2026-04-20 00:53:49.551936 | orchestrator | skipping: [testbed-node-1] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2026-04-20 00:53:49.551939 | orchestrator | skipping: [testbed-node-2] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2026-04-20 00:53:49.551943 | orchestrator | skipping: [testbed-node-0] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2026-04-20 00:53:49.551947 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.551951 | orchestrator | skipping: [testbed-node-1] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2026-04-20 00:53:49.551954 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.551958 | orchestrator | skipping: [testbed-node-2] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2026-04-20 00:53:49.551962 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.551968 | orchestrator | 2026-04-20 00:53:49.551975 
| orchestrator | TASK [service-check-containers : loadbalancer | Check containers] ************** 2026-04-20 00:53:49.551985 | orchestrator | Monday 20 April 2026 00:49:19 +0000 (0:00:00.837) 0:00:57.087 ********** 2026-04-20 00:53:49.551992 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2026-04-20 00:53:49.552012 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2026-04-20 00:53:49.552019 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2026-04-20 00:53:49.552029 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-20 00:53:49.552036 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-20 00:53:49.552042 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': 
['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2026-04-20 00:53:49.552051 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-20 00:53:49.552058 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-20 00:53:49.552289 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 
'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2026-04-20 00:53:49.552308 | orchestrator | 2026-04-20 00:53:49.552312 | orchestrator | TASK [service-check-containers : loadbalancer | Notify handlers to restart containers] *** 2026-04-20 00:53:49.552316 | orchestrator | Monday 20 April 2026 00:49:21 +0000 (0:00:02.514) 0:00:59.601 ********** 2026-04-20 00:53:49.552320 | orchestrator | changed: [testbed-node-0] => { 2026-04-20 00:53:49.552324 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:53:49.552328 | orchestrator | } 2026-04-20 00:53:49.552332 | orchestrator | changed: [testbed-node-1] => { 2026-04-20 00:53:49.552336 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:53:49.552339 | orchestrator | } 2026-04-20 00:53:49.552343 | orchestrator | changed: [testbed-node-2] => { 2026-04-20 00:53:49.552347 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:53:49.552351 | orchestrator | } 2026-04-20 00:53:49.552355 | orchestrator | 2026-04-20 00:53:49.552358 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-20 00:53:49.552362 | orchestrator | Monday 20 April 2026 00:49:22 +0000 (0:00:00.340) 0:00:59.942 ********** 2026-04-20 00:53:49.552366 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2026-04-20 00:53:49.552371 | 
orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-20 00:53:49.552375 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-20 00:53:49.552379 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.552383 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  
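For reference, the loop items in the container tasks above all follow the same kolla-ansible service-definition shape (name, group, image, volumes, dimensions, optional healthcheck). A minimal sketch of how one such entry could map onto container health-check flags; the dict values are copied from the log records above, while the `docker_healthcheck_args` helper is purely illustrative and not part of kolla-ansible:

```python
# Service entry as emitted in the loop output above (values copied from the log).
service = {
    "container_name": "proxysql",
    "group": "loadbalancer",
    "enabled": True,
    "image": "registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328",
    "privileged": False,
    "volumes": [
        "/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro",
        "kolla_logs:/var/log/kolla/",
        "proxysql_socket:/var/lib/kolla/proxysql/",
    ],
    "dimensions": {},
    "healthcheck": {
        "interval": "30",
        "retries": "3",
        "start_period": "5",
        "test": ["CMD-SHELL", "healthcheck_listen proxysql 6032"],
        "timeout": "30",
    },
}

def docker_healthcheck_args(svc):
    """Hypothetical helper: translate the kolla-style healthcheck dict
    into docker-run style --health-* flags. Returns [] when the service
    (e.g. keepalived above) defines no healthcheck."""
    hc = svc.get("healthcheck")
    if not hc:
        return []
    return [
        "--health-cmd", hc["test"][1],          # the CMD-SHELL payload
        "--health-interval", hc["interval"] + "s",
        "--health-retries", hc["retries"],
        "--health-start-period", hc["start_period"] + "s",
        "--health-timeout", hc["timeout"] + "s",
    ]

print(docker_healthcheck_args(service))
```

Note that entries without a `healthcheck` key (the keepalived items in the log) would yield no flags under this sketch, matching the empty `dimensions`/no-healthcheck records shown above.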
2026-04-20 00:53:49.552387 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-20 00:53:49.552406 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-20 00:53:49.552410 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.552414 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 
'timeout': '30'}}})  2026-04-20 00:53:49.552418 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2026-04-20 00:53:49.552488 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2026-04-20 00:53:49.552498 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.552504 | orchestrator | 2026-04-20 00:53:49.552510 | orchestrator | TASK [include_role : aodh] ***************************************************** 2026-04-20 00:53:49.552516 | orchestrator | Monday 20 April 2026 00:49:23 +0000 (0:00:01.464) 0:01:01.407 ********** 2026-04-20 00:53:49.552522 | orchestrator | included: aodh for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:53:49.552528 | orchestrator | 2026-04-20 00:53:49.552534 | orchestrator | TASK [haproxy-config : Copying over aodh haproxy config] *********************** 2026-04-20 00:53:49.552540 | orchestrator | Monday 20 April 2026 00:49:24 +0000 
(0:00:00.811) 0:01:02.219 ********** 2026-04-20 00:53:49.552615 | orchestrator | changed: [testbed-node-0] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-api:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.552929 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-evaluator:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2026-04-20 00:53:49.552944 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-listener:20.0.0.20260328', 'volumes': 
['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.552950 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-notifier:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.552977 | orchestrator | changed: [testbed-node-1] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-api:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': 
['option httpchk']}}}}) 2026-04-20 00:53:49.552986 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-evaluator:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2026-04-20 00:53:49.552996 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-listener:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.553013 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-notifier:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.553017 | 
orchestrator | changed: [testbed-node-2] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-api:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.553044 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-evaluator:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2026-04-20 00:53:49.553049 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-listener:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.553055 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-notifier:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.553064 | orchestrator | 2026-04-20 00:53:49.553068 | orchestrator | TASK [haproxy-config : Add configuration for aodh when using single external frontend] *** 2026-04-20 00:53:49.553072 | orchestrator | Monday 20 April 2026 00:49:28 +0000 (0:00:03.545) 0:01:05.764 ********** 2026-04-20 00:53:49.553142 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-api:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': 
'8042', 'backend_http_extra': ['option httpchk']}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.553154 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-evaluator:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2026-04-20 00:53:49.553160 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-listener:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.553166 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-notifier:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.553171 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.553181 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-api:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.553193 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-evaluator:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2026-04-20 00:53:49.553266 | 
orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-listener:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.553274 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-notifier:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.553279 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.553285 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-api:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8042'], 'timeout': '30'}, 'haproxy': 
{'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.553291 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-evaluator:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2026-04-20 00:53:49.553304 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-listener:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.553310 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/aodh-notifier:20.0.0.20260328', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.553317 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.553322 | orchestrator | 2026-04-20 00:53:49.553328 | orchestrator | TASK [haproxy-config : Configuring firewall for aodh] ************************** 2026-04-20 00:53:49.553335 | orchestrator | Monday 20 April 2026 00:49:28 +0000 (0:00:00.680) 0:01:06.444 ********** 2026-04-20 00:53:49.553343 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.553365 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.553374 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.553411 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.553419 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.553477 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.553484 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh_api', 'value': 
{'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.553490 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.553496 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.553502 | orchestrator | 2026-04-20 00:53:49.553508 | orchestrator | TASK [proxysql-config : Copying over aodh ProxySQL users config] *************** 2026-04-20 00:53:49.553514 | orchestrator | Monday 20 April 2026 00:49:29 +0000 (0:00:01.128) 0:01:07.573 ********** 2026-04-20 00:53:49.553520 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.553525 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.553531 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.553537 | orchestrator | 2026-04-20 00:53:49.553543 | orchestrator | TASK [proxysql-config : Copying over aodh ProxySQL rules config] *************** 2026-04-20 00:53:49.553557 | orchestrator | Monday 20 April 2026 00:49:31 +0000 (0:00:01.265) 0:01:08.839 ********** 2026-04-20 00:53:49.553563 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.553570 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.553576 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.553582 | orchestrator | 2026-04-20 00:53:49.553589 | orchestrator | TASK [include_role : barbican] ************************************************* 2026-04-20 00:53:49.553595 | orchestrator | Monday 20 April 2026 00:49:33 +0000 (0:00:02.168) 0:01:11.008 ********** 2026-04-20 00:53:49.553601 | orchestrator | included: barbican for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:53:49.553608 | orchestrator | 2026-04-20 00:53:49.553614 | orchestrator | TASK 
[haproxy-config : Copying over barbican haproxy config] ******************* 2026-04-20 00:53:49.553620 | orchestrator | Monday 20 April 2026 00:49:33 +0000 (0:00:00.631) 0:01:11.639 ********** 2026-04-20 00:53:49.553637 | orchestrator | changed: [testbed-node-1] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/barbican-api:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.553646 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/barbican-keystone-listener:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': 
'30'}}})  2026-04-20 00:53:49.553978 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/barbican-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.553990 | orchestrator | changed: [testbed-node-0] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/barbican-api:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.554005 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/barbican-keystone-listener:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.554047 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/barbican-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.554054 | orchestrator | changed: [testbed-node-2] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/barbican-api:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option 
httpchk']}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.554122 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/barbican-keystone-listener:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.554129 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/barbican-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.554138 | orchestrator | 2026-04-20 00:53:49.554142 | orchestrator | TASK [haproxy-config : Add configuration for barbican when using single external frontend] *** 2026-04-20 00:53:49.554147 | orchestrator | Monday 20 April 2026 00:49:38 +0000 (0:00:04.166) 0:01:15.806 ********** 2026-04-20 00:53:49.554151 | orchestrator | skipping: [testbed-node-1] => 
(item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/barbican-api:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.554160 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/barbican-api:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 
'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.554164 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/barbican-keystone-listener:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.554184 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/barbican-keystone-listener:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.554473 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/barbican-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.554498 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.554503 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/barbican-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.554507 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.554515 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/barbican-api:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 
'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.554520 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/barbican-keystone-listener:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.554560 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/barbican-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.554565 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.554569 | orchestrator | 2026-04-20 00:53:49.554573 | orchestrator | TASK [haproxy-config : Configuring firewall for barbican] ********************** 2026-04-20 00:53:49.554577 | orchestrator | Monday 20 April 2026 00:49:39 +0000 (0:00:01.001) 0:01:16.807 ********** 2026-04-20 00:53:49.554586 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican_api', 'value': {'enabled': 
'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.554592 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.554597 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.554601 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.554605 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.554609 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.554613 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.554616 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.554621 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.554624 | orchestrator | 2026-04-20 00:53:49.554628 | orchestrator | TASK [proxysql-config : Copying over 
barbican ProxySQL users config] *********** 2026-04-20 00:53:49.554632 | orchestrator | Monday 20 April 2026 00:49:39 +0000 (0:00:00.750) 0:01:17.558 ********** 2026-04-20 00:53:49.554636 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.554640 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.554643 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.554647 | orchestrator | 2026-04-20 00:53:49.554651 | orchestrator | TASK [proxysql-config : Copying over barbican ProxySQL rules config] *********** 2026-04-20 00:53:49.554655 | orchestrator | Monday 20 April 2026 00:49:41 +0000 (0:00:01.205) 0:01:18.763 ********** 2026-04-20 00:53:49.554658 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.554662 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.554666 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.554670 | orchestrator | 2026-04-20 00:53:49.554678 | orchestrator | TASK [include_role : blazar] *************************************************** 2026-04-20 00:53:49.554684 | orchestrator | Monday 20 April 2026 00:49:42 +0000 (0:00:01.841) 0:01:20.605 ********** 2026-04-20 00:53:49.554690 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.554696 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.554705 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.554712 | orchestrator | 2026-04-20 00:53:49.554719 | orchestrator | TASK [include_role : ceph-rgw] ************************************************* 2026-04-20 00:53:49.554725 | orchestrator | Monday 20 April 2026 00:49:43 +0000 (0:00:00.262) 0:01:20.868 ********** 2026-04-20 00:53:49.554787 | orchestrator | included: ceph-rgw for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:53:49.554792 | orchestrator | 2026-04-20 00:53:49.554796 | orchestrator | TASK [haproxy-config : Copying over ceph-rgw haproxy config] ******************* 2026-04-20 00:53:49.554799 | orchestrator | Monday 20 April 2026 
00:49:43 +0000 (0:00:00.769) 0:01:21.637 ********** 2026-04-20 00:53:49.554842 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}}}}) 2026-04-20 00:53:49.554861 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}}}}) 2026-04-20 00:53:49.554868 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': 
False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}}}}) 2026-04-20 00:53:49.554874 | orchestrator | 2026-04-20 00:53:49.554880 | orchestrator | TASK [haproxy-config : Add configuration for ceph-rgw when using single external frontend] *** 2026-04-20 00:53:49.554886 | orchestrator | Monday 20 April 2026 00:49:46 +0000 (0:00:02.385) 0:01:24.023 ********** 2026-04-20 00:53:49.555219 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}}}})  2026-04-20 00:53:49.555240 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.555251 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ceph-rgw', 
'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}}}})  2026-04-20 00:53:49.555269 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.555325 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}}}})  2026-04-20 00:53:49.555334 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.555339 | orchestrator | 2026-04-20 00:53:49.555345 | orchestrator | TASK [haproxy-config : Configuring firewall for ceph-rgw] ********************** 2026-04-20 00:53:49.555351 | orchestrator | Monday 
20 April 2026 00:49:47 +0000 (0:00:01.415) 0:01:25.438 ********** 2026-04-20 00:53:49.555640 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}})  2026-04-20 00:53:49.555692 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}})  2026-04-20 00:53:49.555732 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.555740 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}})  2026-04-20 00:53:49.555746 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}})  2026-04-20 00:53:49.555752 | orchestrator | 
skipping: [testbed-node-2] 2026-04-20 00:53:49.555764 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}})  2026-04-20 00:53:49.555771 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:7480 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:7480 check inter 2000 rise 2 fall 5']}})  2026-04-20 00:53:49.555788 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.555794 | orchestrator | 2026-04-20 00:53:49.555801 | orchestrator | TASK [proxysql-config : Copying over ceph-rgw ProxySQL users config] *********** 2026-04-20 00:53:49.555807 | orchestrator | Monday 20 April 2026 00:49:50 +0000 (0:00:02.469) 0:01:27.907 ********** 2026-04-20 00:53:49.555813 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.555820 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.555825 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.555831 | orchestrator | 2026-04-20 00:53:49.555837 | orchestrator | TASK [proxysql-config : Copying over ceph-rgw ProxySQL rules config] *********** 2026-04-20 00:53:49.555842 | orchestrator | Monday 20 April 2026 00:49:50 +0000 (0:00:00.536) 0:01:28.444 ********** 2026-04-20 00:53:49.555848 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.555854 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.555861 | orchestrator | skipping: 
[testbed-node-2] 2026-04-20 00:53:49.555865 | orchestrator | 2026-04-20 00:53:49.555869 | orchestrator | TASK [include_role : cinder] *************************************************** 2026-04-20 00:53:49.555873 | orchestrator | Monday 20 April 2026 00:49:52 +0000 (0:00:01.299) 0:01:29.744 ********** 2026-04-20 00:53:49.555921 | orchestrator | included: cinder for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:53:49.555927 | orchestrator | 2026-04-20 00:53:49.555931 | orchestrator | TASK [haproxy-config : Copying over cinder haproxy config] ********************* 2026-04-20 00:53:49.555935 | orchestrator | Monday 20 April 2026 00:49:52 +0000 (0:00:00.901) 0:01:30.645 ********** 2026-04-20 00:53:49.555940 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-api:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8776'], 'timeout': '30'}, 'wsgi': 'cinder.wsgi.api:application', 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.555946 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/cinder-scheduler:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.555951 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-volume:26.2.1.20260328', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.555970 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-backup:26.2.1.20260328', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2026-04-20 
00:53:49.556004 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-api:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8776'], 'timeout': '30'}, 'wsgi': 'cinder.wsgi.api:application', 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.556010 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-scheduler:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.556014 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/cinder-volume:26.2.1.20260328', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.556019 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-backup:26.2.1.20260328', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.556073 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-api:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8776'], 'timeout': '30'}, 'wsgi': 'cinder.wsgi.api:application', 
'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.556080 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-scheduler:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.556110 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-volume:26.2.1.20260328', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.556115 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-backup', 'value': {'container_name': 
'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-backup:26.2.1.20260328', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.556119 | orchestrator | 2026-04-20 00:53:49.556123 | orchestrator | TASK [haproxy-config : Add configuration for cinder when using single external frontend] *** 2026-04-20 00:53:49.556128 | orchestrator | Monday 20 April 2026 00:49:56 +0000 (0:00:03.941) 0:01:34.587 ********** 2026-04-20 00:53:49.556132 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-api:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8776'], 'timeout': '30'}, 'wsgi': 'cinder.wsgi.api:application', 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  
2026-04-20 00:53:49.556352 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-scheduler:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.556415 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-api:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8776'], 'timeout': '30'}, 'wsgi': 'cinder.wsgi.api:application', 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.556439 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/cinder-volume:26.2.1.20260328', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.556446 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-scheduler:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.556454 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-backup:26.2.1.20260328', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2026-04-20 
00:53:49.556468 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.556476 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-volume:26.2.1.20260328', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.556481 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-backup:26.2.1.20260328', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.556484 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.556513 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-api:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8776'], 'timeout': '30'}, 'wsgi': 'cinder.wsgi.api:application', 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.556518 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-scheduler:26.2.1.20260328', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.556617 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-volume:26.2.1.20260328', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', '', 
'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.557282 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/cinder-backup:26.2.1.20260328', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.557301 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.557305 | orchestrator | 2026-04-20 00:53:49.557310 | orchestrator | TASK [haproxy-config : Configuring firewall for cinder] ************************ 2026-04-20 00:53:49.557314 | orchestrator | Monday 20 April 2026 00:49:57 +0000 (0:00:00.791) 0:01:35.378 ********** 2026-04-20 00:53:49.557319 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.557324 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.557329 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.557333 | 
orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.557388 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.557394 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.557398 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.557402 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.557406 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.557409 | orchestrator | 2026-04-20 00:53:49.557413 | orchestrator | TASK [proxysql-config : Copying over cinder ProxySQL users config] ************* 2026-04-20 00:53:49.557417 | orchestrator | Monday 20 April 2026 00:49:58 +0000 (0:00:01.005) 0:01:36.383 ********** 2026-04-20 00:53:49.557439 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.557444 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.557453 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.557457 | orchestrator | 2026-04-20 00:53:49.557461 | orchestrator | TASK [proxysql-config : Copying over cinder ProxySQL rules config] ************* 2026-04-20 00:53:49.557465 | orchestrator 
| Monday 20 April 2026 00:49:59 +0000 (0:00:01.188) 0:01:37.572 ********** 2026-04-20 00:53:49.557468 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.557472 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.557476 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.557480 | orchestrator | 2026-04-20 00:53:49.557483 | orchestrator | TASK [include_role : cloudkitty] *********************************************** 2026-04-20 00:53:49.557487 | orchestrator | Monday 20 April 2026 00:50:01 +0000 (0:00:02.082) 0:01:39.655 ********** 2026-04-20 00:53:49.557491 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.557494 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.557498 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.557502 | orchestrator | 2026-04-20 00:53:49.557505 | orchestrator | TASK [include_role : cyborg] *************************************************** 2026-04-20 00:53:49.557509 | orchestrator | Monday 20 April 2026 00:50:02 +0000 (0:00:00.311) 0:01:39.966 ********** 2026-04-20 00:53:49.557513 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.557517 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.557520 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.557524 | orchestrator | 2026-04-20 00:53:49.557528 | orchestrator | TASK [include_role : designate] ************************************************ 2026-04-20 00:53:49.557532 | orchestrator | Monday 20 April 2026 00:50:02 +0000 (0:00:00.333) 0:01:40.299 ********** 2026-04-20 00:53:49.557535 | orchestrator | included: designate for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:53:49.557539 | orchestrator | 2026-04-20 00:53:49.557543 | orchestrator | TASK [haproxy-config : Copying over designate haproxy config] ****************** 2026-04-20 00:53:49.557547 | orchestrator | Monday 20 April 2026 00:50:03 +0000 (0:00:00.879) 0:01:41.178 ********** 2026-04-20 00:53:49.557554 | 
orchestrator | changed: [testbed-node-0] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-api:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.557559 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-backend-bind9:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2026-04-20 00:53:49.557601 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/designate-central:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.557611 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-mdns:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.557615 | orchestrator | changed: [testbed-node-1] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-api:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': 
True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.557620 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-producer:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.557625 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-backend-bind9:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2026-04-20 00:53:49.557630 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.557652 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-central:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.557661 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-sink:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.557668 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-mdns:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.557675 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-producer:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.557685 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.557692 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-sink:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.558219 | orchestrator | changed: [testbed-node-2] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-api:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.558249 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-backend-bind9:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2026-04-20 
00:53:49.558254 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-central:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.558258 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-mdns:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.558263 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-producer:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  
2026-04-20 00:53:49.558267 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.558319 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-sink:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.558325 | orchestrator |
2026-04-20 00:53:49.558330 | orchestrator | TASK [haproxy-config : Add configuration for designate when using single external frontend] ***
2026-04-20 00:53:49.558334 | orchestrator | Monday 20 April 2026 00:50:08 +0000 (0:00:04.823) 0:01:46.002 **********
2026-04-20 00:53:49.558352 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-api:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.558357 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-backend-bind9:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2026-04-20 00:53:49.558362 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-central:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.558367 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-mdns:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.558373 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-producer:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.558404 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.558409 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-sink:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.558413 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.558418 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-api:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.558471 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 
'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-backend-bind9:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2026-04-20 00:53:49.558476 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-central:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.558512 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-mdns:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.558518 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-producer', 
'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-producer:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.558522 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.558526 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-sink:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.558530 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.558536 | orchestrator 
| skipping: [testbed-node-2] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-api:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.558545 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-backend-bind9:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2026-04-20 00:53:49.558575 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/designate-central:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.558581 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-mdns:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.558585 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-producer:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.558589 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/designate-worker:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.558595 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/2024.2/designate-sink:20.0.1.20260328', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.558599 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.558603 | orchestrator |
2026-04-20 00:53:49.558611 | orchestrator | TASK [haproxy-config : Configuring firewall for designate] *********************
2026-04-20 00:53:49.558615 | orchestrator | Monday 20 April 2026 00:50:09 +0000 (0:00:00.946) 0:01:46.949 **********
2026-04-20 00:53:49.558619 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.558626 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.558630 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.558635 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.558639 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.558643 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.558673 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.558681 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001', 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.558687 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.558693 | orchestrator |
2026-04-20 00:53:49.558699 | orchestrator | TASK [proxysql-config : Copying over designate ProxySQL users config] **********
2026-04-20 00:53:49.558704 | orchestrator | Monday 20 April 2026 00:50:10 +0000 (0:00:01.577) 0:01:48.527 **********
2026-04-20 00:53:49.558710 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:53:49.558716 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:53:49.558722 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:53:49.558728 | orchestrator |
2026-04-20 00:53:49.558733 | orchestrator | TASK [proxysql-config : Copying over designate ProxySQL rules config] **********
2026-04-20 00:53:49.558739 | orchestrator | Monday 20 April 2026 00:50:11 +0000 (0:00:01.157) 0:01:49.684 **********
2026-04-20 00:53:49.558745 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:53:49.558751 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:53:49.558757 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:53:49.558762 | orchestrator |
2026-04-20 00:53:49.558766 | orchestrator | TASK [include_role : etcd] *****************************************************
2026-04-20 00:53:49.558769 | orchestrator | Monday 20 April 2026 00:50:13 +0000 (0:00:01.927) 0:01:51.611 **********
2026-04-20 00:53:49.558773 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.558777 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.558781 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.558785 | orchestrator |
2026-04-20 00:53:49.558788 | orchestrator | TASK [include_role : glance] ***************************************************
2026-04-20 00:53:49.559288 | orchestrator | Monday 20 April 2026 00:50:14 +0000 (0:00:00.251) 0:01:51.863 **********
2026-04-20 00:53:49.559362 | orchestrator | included: glance for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:53:49.559369 | orchestrator |
2026-04-20 00:53:49.559373 | orchestrator | TASK [haproxy-config : Copying over glance haproxy config] *********************
2026-04-20 00:53:49.559377 | orchestrator | Monday 20 April 2026 00:50:14 +0000 (0:00:00.710) 0:01:52.573 **********
2026-04-20 00:53:49.559385 | orchestrator | changed: [testbed-node-0] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/glance-api:30.1.1.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'privileged': True, 'volumes': 
['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}}) 2026-04-20 00:53:49.559970 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/2024.2/glance-tls-proxy:30.1.1.20260328', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u 
openstack:password 192.168.16.10:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2026-04-20 00:53:49.560028 | orchestrator | changed: [testbed-node-2] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/glance-api:30.1.1.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', 
'/dev/shm:/dev/shm', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}}) 2026-04-20 00:53:49.560094 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/2024.2/glance-tls-proxy:30.1.1.20260328', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 
6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2026-04-20 00:53:49.560102 | orchestrator | changed: [testbed-node-1] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/glance-api:30.1.1.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9292'], 'timeout': '30'}, 
'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}}) 2026-04-20 00:53:49.560140 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/2024.2/glance-tls-proxy:30.1.1.20260328', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file 
ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2026-04-20 00:53:49.560146 | orchestrator | 2026-04-20 00:53:49.560150 | orchestrator | TASK [haproxy-config : Add configuration for glance when using single external frontend] *** 2026-04-20 00:53:49.560155 | orchestrator | Monday 20 April 2026 00:50:19 +0000 (0:00:04.965) 0:01:57.539 ********** 2026-04-20 00:53:49.560159 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/glance-api:30.1.1.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})  2026-04-20 00:53:49.560194 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/2024.2/glance-tls-proxy:30.1.1.20260328', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': 
['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2026-04-20 00:53:49.560200 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.560205 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/glance-api:30.1.1.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': 
{'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})  2026-04-20 00:53:49.560240 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/2024.2/glance-tls-proxy:30.1.1.20260328', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server 
testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2026-04-20 00:53:49.560246 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.560251 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/glance-api:30.1.1.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 
'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})  2026-04-20 00:53:49.560260 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/2024.2/glance-tls-proxy:30.1.1.20260328', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server 
testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2026-04-20 00:53:49.560265 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.560269 | orchestrator | 2026-04-20 00:53:49.560272 | orchestrator | TASK [haproxy-config : Configuring firewall for glance] ************************ 2026-04-20 00:53:49.560276 | orchestrator | Monday 20 April 2026 00:50:23 +0000 (0:00:03.475) 0:02:01.014 ********** 2026-04-20 00:53:49.560330 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2026-04-20 00:53:49.560337 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 
'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2026-04-20 00:53:49.560341 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.560345 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2026-04-20 00:53:49.560369 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2026-04-20 00:53:49.560373 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.560377 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server 
testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2026-04-20 00:53:49.560384 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h', 'option httpchk'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2026-04-20 00:53:49.560388 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.560391 | orchestrator | 2026-04-20 00:53:49.560395 | orchestrator | TASK [proxysql-config : Copying over glance ProxySQL users config] ************* 2026-04-20 00:53:49.560399 | orchestrator | Monday 20 April 2026 00:50:26 +0000 (0:00:03.387) 0:02:04.402 ********** 2026-04-20 00:53:49.560403 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.560407 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.560410 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.560414 | orchestrator | 2026-04-20 00:53:49.560418 | orchestrator | TASK [proxysql-config : Copying over glance ProxySQL rules config] ************* 2026-04-20 00:53:49.560441 | orchestrator | Monday 20 April 2026 00:50:28 +0000 (0:00:01.464) 0:02:05.867 ********** 2026-04-20 00:53:49.560448 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.560452 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.560456 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.560459 | orchestrator | 2026-04-20 00:53:49.560463 | orchestrator | TASK [include_role : gnocchi] ************************************************** 2026-04-20 
00:53:49.560467 | orchestrator | Monday 20 April 2026 00:50:30 +0000 (0:00:01.920) 0:02:07.788 ********** 2026-04-20 00:53:49.560471 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.560474 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.560478 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.560482 | orchestrator | 2026-04-20 00:53:49.561376 | orchestrator | TASK [include_role : grafana] ************************************************** 2026-04-20 00:53:49.561412 | orchestrator | Monday 20 April 2026 00:50:30 +0000 (0:00:00.261) 0:02:08.050 ********** 2026-04-20 00:53:49.561418 | orchestrator | included: grafana for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:53:49.561479 | orchestrator | 2026-04-20 00:53:49.561486 | orchestrator | TASK [haproxy-config : Copying over grafana haproxy config] ******************** 2026-04-20 00:53:49.561501 | orchestrator | Monday 20 April 2026 00:50:31 +0000 (0:00:00.833) 0:02:08.884 ********** 2026-04-20 00:53:49.561510 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.561519 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.561525 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.561531 | orchestrator | 2026-04-20 00:53:49.561542 | orchestrator | TASK [haproxy-config : Add configuration for grafana when using single external frontend] *** 2026-04-20 00:53:49.561550 | orchestrator | Monday 20 April 2026 00:50:34 +0000 (0:00:03.151) 0:02:12.035 ********** 2026-04-20 00:53:49.561554 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': 
['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.561643 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.561661 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.561667 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.561674 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 
'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.561680 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.561686 | orchestrator | 2026-04-20 00:53:49.561692 | orchestrator | TASK [haproxy-config : Configuring firewall for grafana] *********************** 2026-04-20 00:53:49.561698 | orchestrator | Monday 20 April 2026 00:50:34 +0000 (0:00:00.359) 0:02:12.394 ********** 2026-04-20 00:53:49.561705 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.561713 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.561721 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.561728 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.561734 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.561740 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.561746 | orchestrator | skipping: [testbed-node-2] => 
(item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.561756 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.561762 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.561767 | orchestrator | 2026-04-20 00:53:49.561774 | orchestrator | TASK [proxysql-config : Copying over grafana ProxySQL users config] ************ 2026-04-20 00:53:49.561780 | orchestrator | Monday 20 April 2026 00:50:35 +0000 (0:00:00.589) 0:02:12.984 ********** 2026-04-20 00:53:49.561785 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.561791 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.561797 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.561802 | orchestrator | 2026-04-20 00:53:49.561808 | orchestrator | TASK [proxysql-config : Copying over grafana ProxySQL rules config] ************ 2026-04-20 00:53:49.561821 | orchestrator | Monday 20 April 2026 00:50:36 +0000 (0:00:01.220) 0:02:14.205 ********** 2026-04-20 00:53:49.561827 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.561832 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.561838 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.561844 | orchestrator | 2026-04-20 00:53:49.561850 | orchestrator | TASK [include_role : heat] ***************************************************** 2026-04-20 00:53:49.561857 | orchestrator | Monday 20 April 2026 00:50:38 +0000 (0:00:01.632) 0:02:15.838 ********** 2026-04-20 00:53:49.561863 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.561869 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.561875 
| orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.561882 | orchestrator | 2026-04-20 00:53:49.561888 | orchestrator | TASK [include_role : horizon] ************************************************** 2026-04-20 00:53:49.561894 | orchestrator | Monday 20 April 2026 00:50:38 +0000 (0:00:00.381) 0:02:16.219 ********** 2026-04-20 00:53:49.561900 | orchestrator | included: horizon for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:53:49.561906 | orchestrator | 2026-04-20 00:53:49.561968 | orchestrator | TASK [haproxy-config : Copying over horizon haproxy config] ******************** 2026-04-20 00:53:49.561974 | orchestrator | Monday 20 April 2026 00:50:39 +0000 (0:00:00.797) 0:02:17.017 ********** 2026-04-20 00:53:49.561979 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 
'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-20 00:53:49.562010 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-20 00:53:49.562045 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': 
['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-20 00:53:49.562050 | orchestrator | 2026-04-20 00:53:49.562059 | orchestrator | TASK [haproxy-config : Add configuration for horizon when using single external frontend] *** 2026-04-20 00:53:49.562063 | orchestrator | Monday 20 April 2026 00:50:42 +0000 (0:00:03.380) 0:02:20.398 ********** 2026-04-20 00:53:49.562089 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 
'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { 
path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-20 00:53:49.562095 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.562101 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg 
^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-20 00:53:49.562108 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.562144 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 
'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-20 00:53:49.562150 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.562154 | orchestrator | 2026-04-20 00:53:49.562158 | orchestrator | TASK [haproxy-config : Configuring firewall for horizon] *********************** 2026-04-20 00:53:49.562162 | orchestrator | Monday 20 April 2026 00:50:43 +0000 (0:00:00.904) 0:02:21.303 ********** 2026-04-20 00:53:49.562167 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}})  2026-04-20 00:53:49.562172 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2026-04-20 00:53:49.562182 | orchestrator | skipping: [testbed-node-0] => (item={'key': 
'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}})  2026-04-20 00:53:49.562190 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2026-04-20 00:53:49.562195 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})  2026-04-20 00:53:49.562199 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}})  2026-04-20 00:53:49.562203 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.562207 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}})  2026-04-20 00:53:49.562238 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': 
['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2026-04-20 00:53:49.562244 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2026-04-20 00:53:49.562252 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}})  2026-04-20 00:53:49.562260 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}})  2026-04-20 00:53:49.562268 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2026-04-20 00:53:49.562278 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg 
^/.well-known/acme-challenge/.+ }']}})  2026-04-20 00:53:49.562285 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})  2026-04-20 00:53:49.562297 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})  2026-04-20 00:53:49.562303 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.562309 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.562333 | orchestrator | 2026-04-20 00:53:49.562340 | orchestrator | TASK [proxysql-config : Copying over horizon ProxySQL users config] ************ 2026-04-20 00:53:49.562345 | orchestrator | Monday 20 April 2026 00:50:44 +0000 (0:00:00.940) 0:02:22.244 ********** 2026-04-20 00:53:49.562352 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.562357 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.562370 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.562377 | orchestrator | 2026-04-20 00:53:49.562383 | orchestrator | TASK [proxysql-config : Copying over horizon ProxySQL rules config] ************ 2026-04-20 00:53:49.562395 | orchestrator | Monday 20 April 2026 00:50:45 +0000 (0:00:01.257) 0:02:23.501 ********** 2026-04-20 00:53:49.562401 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.562407 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.562413 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.562419 | orchestrator | 2026-04-20 00:53:49.562443 | orchestrator | TASK [include_role : influxdb] ************************************************* 2026-04-20 00:53:49.562447 | orchestrator | Monday 20 April 2026 00:50:47 +0000 (0:00:02.038) 0:02:25.540 ********** 2026-04-20 00:53:49.562451 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.562455 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.562458 | 
orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.562462 | orchestrator | 2026-04-20 00:53:49.562466 | orchestrator | TASK [include_role : ironic] *************************************************** 2026-04-20 00:53:49.562470 | orchestrator | Monday 20 April 2026 00:50:48 +0000 (0:00:00.541) 0:02:26.082 ********** 2026-04-20 00:53:49.562480 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.562484 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.562488 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.562492 | orchestrator | 2026-04-20 00:53:49.562495 | orchestrator | TASK [include_role : keystone] ************************************************* 2026-04-20 00:53:49.562499 | orchestrator | Monday 20 April 2026 00:50:48 +0000 (0:00:00.297) 0:02:26.379 ********** 2026-04-20 00:53:49.562503 | orchestrator | included: keystone for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:53:49.562507 | orchestrator | 2026-04-20 00:53:49.562510 | orchestrator | TASK [haproxy-config : Copying over keystone haproxy config] ******************* 2026-04-20 00:53:49.562514 | orchestrator | Monday 20 April 2026 00:50:49 +0000 (0:00:00.917) 0:02:27.296 ********** 2026-04-20 00:53:49.562564 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 
'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-20 00:53:49.562571 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:53:49.562581 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-20 00:53:49.562588 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': 
['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-20 00:53:49.562593 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:53:49.562634 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-20 00:53:49.562644 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-20 00:53:49.562655 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': 
'30'}}})  2026-04-20 00:53:49.562662 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-20 00:53:49.562667 | orchestrator | 2026-04-20 00:53:49.562676 | orchestrator | TASK [haproxy-config : Add configuration for keystone when using single external frontend] *** 2026-04-20 00:53:49.562682 | orchestrator | Monday 20 April 2026 00:50:53 +0000 (0:00:04.095) 0:02:31.391 ********** 2026-04-20 00:53:49.562689 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 
'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-20 00:53:49.562737 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:53:49.562745 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-20 00:53:49.562757 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.562763 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 
'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-20 00:53:49.562772 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:53:49.562779 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 
'timeout': '30'}}})  2026-04-20 00:53:49.562785 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.562837 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-20 00:53:49.562853 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:53:49.562859 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': 
{'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-20 00:53:49.562865 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.562871 | orchestrator | 2026-04-20 00:53:49.562877 | orchestrator | TASK [haproxy-config : Configuring firewall for keystone] ********************** 2026-04-20 00:53:49.562883 | orchestrator | Monday 20 April 2026 00:50:54 +0000 (0:00:00.660) 0:02:32.051 ********** 2026-04-20 00:53:49.562890 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}})  2026-04-20 00:53:49.562898 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}})  2026-04-20 00:53:49.562906 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.562915 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}})  2026-04-20 00:53:49.562921 | orchestrator | skipping: [testbed-node-1] => 
(item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}})  2026-04-20 00:53:49.562927 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.562933 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}})  2026-04-20 00:53:49.562939 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}})  2026-04-20 00:53:49.562945 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.562950 | orchestrator | 2026-04-20 00:53:49.562956 | orchestrator | TASK [proxysql-config : Copying over keystone ProxySQL users config] *********** 2026-04-20 00:53:49.563036 | orchestrator | Monday 20 April 2026 00:50:55 +0000 (0:00:00.850) 0:02:32.902 ********** 2026-04-20 00:53:49.563043 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.563047 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.563051 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.563054 | orchestrator | 2026-04-20 00:53:49.563058 | orchestrator | TASK [proxysql-config : Copying over keystone ProxySQL rules config] *********** 2026-04-20 00:53:49.563062 | orchestrator | Monday 20 April 2026 00:50:56 +0000 (0:00:01.450) 0:02:34.353 ********** 2026-04-20 00:53:49.563065 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.563069 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.563073 | 
orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.563077 | orchestrator | 2026-04-20 00:53:49.563080 | orchestrator | TASK [include_role : letsencrypt] ********************************************** 2026-04-20 00:53:49.563084 | orchestrator | Monday 20 April 2026 00:50:58 +0000 (0:00:02.015) 0:02:36.368 ********** 2026-04-20 00:53:49.563088 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.563092 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.563095 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.563099 | orchestrator | 2026-04-20 00:53:49.563103 | orchestrator | TASK [include_role : magnum] *************************************************** 2026-04-20 00:53:49.563107 | orchestrator | Monday 20 April 2026 00:50:59 +0000 (0:00:00.624) 0:02:36.993 ********** 2026-04-20 00:53:49.563110 | orchestrator | included: magnum for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:53:49.563114 | orchestrator | 2026-04-20 00:53:49.563118 | orchestrator | TASK [haproxy-config : Copying over magnum haproxy config] ********************* 2026-04-20 00:53:49.563122 | orchestrator | Monday 20 April 2026 00:51:00 +0000 (0:00:00.964) 0:02:37.957 ********** 2026-04-20 00:53:49.563126 | orchestrator | changed: [testbed-node-0] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/magnum-api:20.0.2.20260328', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 
'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.563132 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/magnum-conductor:20.0.2.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563137 | orchestrator | changed: [testbed-node-2] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/magnum-api:20.0.2.20260328', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}, 'magnum_api_external': {'enabled': 'yes', 'mode': 
'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.563180 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/magnum-conductor:20.0.2.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563186 | orchestrator | changed: [testbed-node-1] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/magnum-api:20.0.2.20260328', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': 
['option httpchk']}}}}) 2026-04-20 00:53:49.563205 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/magnum-conductor:20.0.2.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563209 | orchestrator | 2026-04-20 00:53:49.563216 | orchestrator | TASK [haproxy-config : Add configuration for magnum when using single external frontend] *** 2026-04-20 00:53:49.563235 | orchestrator | Monday 20 April 2026 00:51:03 +0000 (0:00:03.681) 0:02:41.639 ********** 2026-04-20 00:53:49.563245 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/magnum-api:20.0.2.20260328', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}, 
'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.563296 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/magnum-conductor:20.0.2.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563303 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.563308 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/magnum-api:20.0.2.20260328', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 
'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.563312 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/magnum-conductor:20.0.2.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563316 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.563322 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/magnum-api:20.0.2.20260328', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 
'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.563333 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/magnum-conductor:20.0.2.20260328', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563337 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.563341 | orchestrator | 2026-04-20 00:53:49.563345 | orchestrator | TASK [haproxy-config : Configuring firewall for magnum] ************************ 2026-04-20 00:53:49.563378 | orchestrator | Monday 20 April 2026 00:51:04 +0000 (0:00:00.988) 0:02:42.628 ********** 2026-04-20 00:53:49.563383 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.563389 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.563394 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.563397 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 
'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.563401 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.563405 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.563409 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.563413 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.563416 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.563434 | orchestrator | 2026-04-20 00:53:49.563438 | orchestrator | TASK [proxysql-config : Copying over magnum ProxySQL users config] ************* 2026-04-20 00:53:49.563442 | orchestrator | Monday 20 April 2026 00:51:05 +0000 (0:00:00.865) 0:02:43.494 ********** 2026-04-20 00:53:49.563446 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.563449 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.563453 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.563457 | orchestrator | 2026-04-20 00:53:49.563461 | orchestrator | TASK [proxysql-config : Copying over magnum ProxySQL rules config] ************* 2026-04-20 00:53:49.563464 | orchestrator | Monday 20 April 2026 00:51:06 +0000 (0:00:01.191) 0:02:44.685 ********** 2026-04-20 00:53:49.563468 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.563472 | 
orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.563475 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.563483 | orchestrator | 2026-04-20 00:53:49.563486 | orchestrator | TASK [include_role : manila] *************************************************** 2026-04-20 00:53:49.563490 | orchestrator | Monday 20 April 2026 00:51:09 +0000 (0:00:02.052) 0:02:46.738 ********** 2026-04-20 00:53:49.563494 | orchestrator | included: manila for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:53:49.563498 | orchestrator | 2026-04-20 00:53:49.563501 | orchestrator | TASK [haproxy-config : Copying over manila haproxy config] ********************* 2026-04-20 00:53:49.563505 | orchestrator | Monday 20 April 2026 00:51:10 +0000 (0:00:01.279) 0:02:48.017 ********** 2026-04-20 00:53:49.563512 | orchestrator | changed: [testbed-node-0] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-api:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.563545 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 
'registry.osism.tech/kolla/release/2024.2/manila-scheduler:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563551 | orchestrator | changed: [testbed-node-1] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-api:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.563556 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-share:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 
'/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563561 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-data:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563570 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-scheduler:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563574 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-share:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 
'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563598 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-data:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563628 | orchestrator | changed: [testbed-node-2] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-api:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.563632 | orchestrator | skipping: [testbed-node-2] 
=> (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-scheduler:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563640 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-share:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563647 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-data:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2026-04-20 
00:53:49.563651 | orchestrator | 2026-04-20 00:53:49.563655 | orchestrator | TASK [haproxy-config : Add configuration for manila when using single external frontend] *** 2026-04-20 00:53:49.563659 | orchestrator | Monday 20 April 2026 00:51:14 +0000 (0:00:03.828) 0:02:51.846 ********** 2026-04-20 00:53:49.563689 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-api:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.563697 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-scheduler:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563703 | orchestrator | 
skipping: [testbed-node-0] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-share:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563714 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-data:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563719 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.563728 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-api:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl 
http://192.168.16.11:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.563747 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-scheduler:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563798 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-share:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563807 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 
'registry.osism.tech/kolla/release/2024.2/manila-data:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563814 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.563819 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-api:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.563827 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-scheduler:20.0.2.20260328', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563834 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-share:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563838 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release/2024.2/manila-data:20.0.2.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.563842 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.563846 | orchestrator | 2026-04-20 00:53:49.563849 | orchestrator | TASK [haproxy-config : Configuring firewall for manila] ************************ 2026-04-20 00:53:49.563891 | orchestrator | Monday 20 April 2026 00:51:14 +0000 (0:00:00.685) 0:02:52.532 ********** 2026-04-20 00:53:49.563898 | 
orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.563902 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.563907 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.563911 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.563915 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.563922 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.563926 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.563929 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.563933 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.563937 | orchestrator | 2026-04-20 00:53:49.563941 | orchestrator | TASK [proxysql-config : Copying over manila ProxySQL users config] ************* 
2026-04-20 00:53:49.563944 | orchestrator | Monday 20 April 2026 00:51:16 +0000 (0:00:01.383) 0:02:53.915 ********** 2026-04-20 00:53:49.563948 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.563952 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.563956 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.563959 | orchestrator | 2026-04-20 00:53:49.563963 | orchestrator | TASK [proxysql-config : Copying over manila ProxySQL rules config] ************* 2026-04-20 00:53:49.563967 | orchestrator | Monday 20 April 2026 00:51:17 +0000 (0:00:01.220) 0:02:55.135 ********** 2026-04-20 00:53:49.563971 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.563974 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.563978 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.563982 | orchestrator | 2026-04-20 00:53:49.563985 | orchestrator | TASK [include_role : mariadb] ************************************************** 2026-04-20 00:53:49.563991 | orchestrator | Monday 20 April 2026 00:51:19 +0000 (0:00:01.999) 0:02:57.135 ********** 2026-04-20 00:53:49.563997 | orchestrator | included: mariadb for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:53:49.564003 | orchestrator | 2026-04-20 00:53:49.564010 | orchestrator | TASK [mariadb : Ensure mysql monitor user exist] ******************************* 2026-04-20 00:53:49.564016 | orchestrator | Monday 20 April 2026 00:51:20 +0000 (0:00:01.004) 0:02:58.139 ********** 2026-04-20 00:53:49.564022 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0) 2026-04-20 00:53:49.564030 | orchestrator | 2026-04-20 00:53:49.564036 | orchestrator | TASK [haproxy-config : Copying over mariadb haproxy config] ******************** 2026-04-20 00:53:49.564043 | orchestrator | Monday 20 April 2026 00:51:22 +0000 (0:00:01.733) 0:02:59.872 ********** 2026-04-20 00:53:49.564099 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': 
{'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-20 00:53:49.564110 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 
'registry.osism.tech/kolla/release/2024.2/mariadb-clustercheck:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}}})  2026-04-20 00:53:49.564114 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.564121 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 
'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-20 00:53:49.564125 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-clustercheck:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}}})  2026-04-20 00:53:49.564129 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.564163 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 
'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-20 00:53:49.564173 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-clustercheck:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}}})  2026-04-20 00:53:49.564177 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.564181 | orchestrator | 2026-04-20 00:53:49.564185 | orchestrator | TASK [haproxy-config : Add configuration for mariadb when using single external frontend] *** 2026-04-20 00:53:49.564189 | orchestrator | Monday 20 April 2026 
00:51:24 +0000 (0:00:02.125) 0:03:01.998 ********** 2026-04-20 00:53:49.564221 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-20 00:53:49.564230 | orchestrator | skipping: [testbed-node-2] 
=> (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-clustercheck:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}}})  2026-04-20 00:53:49.564234 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.564238 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', 
'']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-20 00:53:49.564244 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-clustercheck:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}}})  2026-04-20 00:53:49.564248 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.564280 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 
'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})
2026-04-20 00:53:49.564289 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-clustercheck:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}}})
2026-04-20 00:53:49.564293 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.564297 | orchestrator |
2026-04-20 00:53:49.564301 | orchestrator | TASK [haproxy-config : Configuring firewall for mariadb] ***********************
2026-04-20 00:53:49.564305 | orchestrator | Monday 20 April 2026 00:51:27 +0000 (0:00:02.720) 0:03:04.718 **********
2026-04-20 00:53:49.564310 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})
2026-04-20 00:53:49.564316 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})
2026-04-20 00:53:49.564320 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.564324 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})
2026-04-20 00:53:49.564358 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})
2026-04-20 00:53:49.564364 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.564368 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})
2026-04-20 00:53:49.564372 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})
2026-04-20 00:53:49.564376 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.564380 | orchestrator |
2026-04-20 00:53:49.564383 | orchestrator | TASK [proxysql-config : Copying over mariadb ProxySQL users config] ************
2026-04-20 00:53:49.564387 | orchestrator | Monday 20 April 2026 00:51:30 +0000 (0:00:03.448) 0:03:08.166 **********
2026-04-20 00:53:49.564399 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:53:49.564403 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:53:49.564407 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:53:49.564410 | orchestrator |
2026-04-20 00:53:49.564414 | orchestrator | TASK [proxysql-config : Copying over mariadb ProxySQL rules config] ************
2026-04-20 00:53:49.564418 | orchestrator | Monday 20 April 2026 00:51:32 +0000 (0:00:02.056) 0:03:10.223 **********
2026-04-20 00:53:49.564576 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.564594 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.564597 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.564601 | orchestrator |
2026-04-20 00:53:49.564605 | orchestrator | TASK [include_role : masakari] *************************************************
2026-04-20 00:53:49.564609 | orchestrator | Monday 20 April 2026 00:51:33 +0000 (0:00:01.379) 0:03:11.602 **********
2026-04-20 00:53:49.564613 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.564617 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.564620 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.564629 | orchestrator |
2026-04-20 00:53:49.564633 | orchestrator | TASK [include_role : memcached] ************************************************
2026-04-20 00:53:49.564637 | orchestrator | Monday 20 April 2026 00:51:34 +0000 (0:00:00.271) 0:03:11.874 **********
2026-04-20 00:53:49.564641 | orchestrator | included: memcached for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:53:49.564644 | orchestrator |
2026-04-20 00:53:49.564648 | orchestrator | TASK [haproxy-config : Copying over memcached haproxy config] ******************
2026-04-20 00:53:49.564657 | orchestrator | Monday 20 April 2026 00:51:35 +0000 (0:00:01.014) 0:03:12.888 **********
2026-04-20 00:53:49.564665 | orchestrator | changed: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/2024.2/memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})
2026-04-20 00:53:49.564744 | orchestrator | changed: [testbed-node-1] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/2024.2/memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})
2026-04-20 00:53:49.564751 | orchestrator | changed: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/2024.2/memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})
2026-04-20 00:53:49.564755 | orchestrator |
2026-04-20 00:53:49.564759 | orchestrator | TASK [haproxy-config : Add configuration for memcached when using single external frontend] ***
2026-04-20 00:53:49.564763 | orchestrator | Monday 20 April 2026 00:51:36 +0000 (0:00:01.746) 0:03:14.634 **********
2026-04-20 00:53:49.564767 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/2024.2/memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})
2026-04-20 00:53:49.564771 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.564775 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/2024.2/memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})
2026-04-20 00:53:49.564782 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.564789 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/2024.2/memcached:1.6.24.20260328', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})
2026-04-20 00:53:49.564793 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.564797 | orchestrator |
2026-04-20 00:53:49.564801 | orchestrator | TASK [haproxy-config : Configuring firewall for memcached] *********************
2026-04-20 00:53:49.564805 | orchestrator | Monday 20 April 2026 00:51:37 +0000 (0:00:00.434) 0:03:15.069 **********
2026-04-20 00:53:49.564809 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})
2026-04-20 00:53:49.564814 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.564838 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})
2026-04-20 00:53:49.564843 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.564847 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})
2026-04-20 00:53:49.564851 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.564855 | orchestrator |
2026-04-20 00:53:49.564859 | orchestrator | TASK [proxysql-config : Copying over memcached ProxySQL users config] **********
2026-04-20 00:53:49.564862 | orchestrator | Monday 20 April 2026 00:51:37 +0000 (0:00:00.608) 0:03:15.551 **********
2026-04-20 00:53:49.564866 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.564870 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.564874 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.564877 | orchestrator |
2026-04-20 00:53:49.564881 | orchestrator | TASK [proxysql-config : Copying over memcached ProxySQL rules config] **********
2026-04-20 00:53:49.564885 | orchestrator | Monday 20 April 2026 00:51:38 +0000 (0:00:01.002) 0:03:16.159 **********
2026-04-20 00:53:49.564888 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.564892 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.564896 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.564900 | orchestrator |
2026-04-20 00:53:49.564903 | orchestrator | TASK [include_role : mistral] **************************************************
2026-04-20 00:53:49.564907 | orchestrator | Monday 20 April 2026 00:51:39 +0000 (0:00:00.247) 0:03:17.162 **********
2026-04-20 00:53:49.564911 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.564915 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.564918 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.564926 | orchestrator |
2026-04-20 00:53:49.564930 | orchestrator | TASK [include_role : neutron] **************************************************
2026-04-20 00:53:49.564934 | orchestrator | Monday 20 April 2026 00:51:39 +0000 (0:00:00.247) 0:03:17.409 **********
2026-04-20 00:53:49.564938 | orchestrator | included: neutron for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:53:49.564941 | orchestrator |
2026-04-20 00:53:49.564945 | orchestrator | TASK [haproxy-config : Copying over neutron haproxy config] ********************
2026-04-20 00:53:49.564949 | orchestrator | Monday 20 April 2026 00:51:40 +0000 (0:00:01.215) 0:03:18.625 **********
2026-04-20 00:53:49.564954 | orchestrator | changed: [testbed-node-0] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-server:26.0.3.20260328', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.564961 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-openvswitch-agent:26.0.3.20260328', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.564997 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-dhcp-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-dhcp-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}, 'pid_mode': '', 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release/2024.2/neutron-dhcp-agent:26.0.3.20260328', 'KOLLA_NAME': 'neutron_dhcp_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}}})
2026-04-20 00:53:49.565003 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-l3-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-l3-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release/2024.2/neutron-l3-agent:26.0.3.20260328', 'KOLLA_LEGACY_IPTABLES': 'false', 'KOLLA_NAME': 'neutron_l3_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}, 'pid_mode': ''}})
2026-04-20 00:53:49.565013 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-sriov-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.565018 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-mlnx-agent:26.0.3.20260328', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})
2026-04-20 00:53:49.565025 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-eswitchd:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})
2026-04-20 00:53:49.565029 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 'NONE', 'timeout': '30'}}})
2026-04-20 00:53:49.565067 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})
2026-04-20 00:53:49.565077 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-bgp-dragent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.565092 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-infoblox-ipam-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}}})
2026-04-20 00:53:49.565099 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-metering-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})
2026-04-20 00:53:49.565109 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/ironic-neutron-agent:26.0.3.20260328', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.565116 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-tls-proxy:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})
2026-04-20 00:53:49.565162 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/release/2024.2/neutron-ovn-agent:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})
2026-04-20 00:53:49.565170 | orchestrator | changed: [testbed-node-1] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-server:26.0.3.20260328', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.565183 | orchestrator | changed: [testbed-node-2] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-server:26.0.3.20260328', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.565193 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-openvswitch-agent:26.0.3.20260328', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.565236 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-openvswitch-agent:26.0.3.20260328', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.565244 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-dhcp-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-dhcp-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}, 'pid_mode': '', 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release/2024.2/neutron-dhcp-agent:26.0.3.20260328', 'KOLLA_NAME': 'neutron_dhcp_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}}})
2026-04-20 00:53:49.565255 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-dhcp-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-dhcp-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}, 'pid_mode': '', 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release/2024.2/neutron-dhcp-agent:26.0.3.20260328', 'KOLLA_NAME': 'neutron_dhcp_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}}})
2026-04-20 00:53:49.565268 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-l3-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-l3-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release/2024.2/neutron-l3-agent:26.0.3.20260328', 'KOLLA_LEGACY_IPTABLES': 'false', 'KOLLA_NAME': 'neutron_l3_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}, 'pid_mode': ''}})
2026-04-20 00:53:49.565303 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-l3-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-l3-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release/2024.2/neutron-l3-agent:26.0.3.20260328', 'KOLLA_LEGACY_IPTABLES': 'false', 'KOLLA_NAME': 'neutron_l3_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}, 'pid_mode': ''}})
2026-04-20 00:53:49.565309 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-sriov-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.565318 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-sriov-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.565322 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-mlnx-agent:26.0.3.20260328', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})
2026-04-20 00:53:49.565326 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-mlnx-agent:26.0.3.20260328', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})
2026-04-20 00:53:49.565333 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-eswitchd:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})
2026-04-20 00:53:49.564337 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-eswitchd:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})
2026-04-20 00:53:49.565370 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 'NONE', 'timeout': '30'}}})
2026-04-20 00:53:49.565375 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/',
'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 'NONE', 'timeout': '30'}}})  2026-04-20 00:53:49.565383 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2026-04-20 00:53:49.565388 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2026-04-20 00:53:49.565394 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 
'registry.osism.tech/kolla/release/2024.2/neutron-bgp-dragent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.565399 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-bgp-dragent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.565463 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-infoblox-ipam-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}}})  2026-04-20 00:53:49.565475 | orchestrator | skipping: [testbed-node-1] => (item={'key': 
'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-infoblox-ipam-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}}})  2026-04-20 00:53:49.565479 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-metering-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-20 00:53:49.565483 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-metering-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-20 00:53:49.565487 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/ironic-neutron-agent:26.0.3.20260328', 'privileged': False, 'enabled': False, 'group': 
'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.565494 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/ironic-neutron-agent:26.0.3.20260328', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.565525 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-tls-proxy:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 
'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2026-04-20 00:53:49.565536 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-tls-proxy:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2026-04-20 00:53:49.565541 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/release/2024.2/neutron-ovn-agent:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2026-04-20 00:53:49.565545 | 
orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/release/2024.2/neutron-ovn-agent:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2026-04-20 00:53:49.565548 | orchestrator | 2026-04-20 00:53:49.565552 | orchestrator | TASK [haproxy-config : Add configuration for neutron when using single external frontend] *** 2026-04-20 00:53:49.565556 | orchestrator | Monday 20 April 2026 00:51:45 +0000 (0:00:04.268) 0:03:22.893 ********** 2026-04-20 00:53:49.565564 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-server:26.0.3.20260328', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': 
['option httpchk']}}}})  2026-04-20 00:53:49.565598 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-openvswitch-agent:26.0.3.20260328', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.565604 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-dhcp-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-dhcp-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}, 'pid_mode': '', 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release/2024.2/neutron-dhcp-agent:26.0.3.20260328', 'KOLLA_NAME': 'neutron_dhcp_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}}})  2026-04-20 00:53:49.565608 | orchestrator | skipping: [testbed-node-0] => 
(item={'key': 'neutron-l3-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-l3-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release/2024.2/neutron-l3-agent:26.0.3.20260328', 'KOLLA_LEGACY_IPTABLES': 'false', 'KOLLA_NAME': 'neutron_l3_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}, 'pid_mode': ''}})  2026-04-20 00:53:49.565617 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-sriov-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.565621 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-mlnx-agent:26.0.3.20260328', 'enabled': False, 'host_in_groups': False, 'volumes': 
['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-20 00:53:49.565659 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-eswitchd:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-20 00:53:49.565665 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 'NONE', 'timeout': '30'}}})  2026-04-20 00:53:49.565669 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-server:26.0.3.20260328', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': 
{}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.565674 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2026-04-20 00:53:49.565681 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-openvswitch-agent:26.0.3.20260328', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 
'/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.565704 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-bgp-dragent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.565713 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-dhcp-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-dhcp-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}, 'pid_mode': '', 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release/2024.2/neutron-dhcp-agent:26.0.3.20260328', 'KOLLA_NAME': 'neutron_dhcp_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}}})  2026-04-20 
00:53:49.565717 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-infoblox-ipam-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}}})  2026-04-20 00:53:49.565721 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-l3-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-l3-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release/2024.2/neutron-l3-agent:26.0.3.20260328', 'KOLLA_LEGACY_IPTABLES': 'false', 'KOLLA_NAME': 'neutron_l3_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}, 'pid_mode': ''}})  2026-04-20 00:53:49.565728 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-server:26.0.3.20260328', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': 
['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.565753 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-metering-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-20 00:53:49.565758 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-sriov-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 
5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.565762 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-openvswitch-agent:26.0.3.20260328', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.565766 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/ironic-neutron-agent:26.0.3.20260328', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.565770 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-mlnx-agent:26.0.3.20260328', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-20 00:53:49.565818 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-dhcp-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-dhcp-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}, 'pid_mode': '', 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release/2024.2/neutron-dhcp-agent:26.0.3.20260328', 'KOLLA_NAME': 'neutron_dhcp_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}}})  2026-04-20 00:53:49.565886 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-tls-proxy:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 
'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2026-04-20 00:53:49.565892 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-eswitchd:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-20 00:53:49.565897 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-l3-agent', 'value': {'cgroupns_mode': 'private', 'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-l3-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_IMAGE': 'registry.osism.tech/kolla/release/2024.2/neutron-l3-agent:26.0.3.20260328', 'KOLLA_LEGACY_IPTABLES': 'false', 'KOLLA_NAME': 'neutron_l3_agent', 'KOLLA_NEUTRON_WRAPPERS': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', '', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}, 'pid_mode': ''}})  2026-04-20 00:53:49.565904 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-agent', 'value': 
{'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/release/2024.2/neutron-ovn-agent:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2026-04-20 00:53:49.565908 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.565913 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 'NONE', 'timeout': '30'}}})  2026-04-20 00:53:49.565939 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-sriov-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.565945 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2026-04-20 00:53:49.565949 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-mlnx-agent:26.0.3.20260328', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-20 00:53:49.565953 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-bgp-dragent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': 
'30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.565957 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-eswitchd:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-20 00:53:49.565963 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-infoblox-ipam-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}}})  2026-04-20 00:53:49.565971 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 'NONE', 'timeout': 
'30'}}})  2026-04-20 00:53:49.566038 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-metering-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-20 00:53:49.566046 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-metadata-agent:26.0.3.20260328', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2026-04-20 00:53:49.566050 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/ironic-neutron-agent:26.0.3.20260328', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.566054 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-bgp-dragent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.566061 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-tls-proxy:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2026-04-20 00:53:49.566100 | orchestrator | skipping: [testbed-node-2] 
=> (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-infoblox-ipam-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}}})  2026-04-20 00:53:49.566107 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/release/2024.2/neutron-ovn-agent:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2026-04-20 00:53:49.566111 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-metering-agent:26.0.3.20260328', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}}})  2026-04-20 00:53:49.566115 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.566118 | orchestrator | skipping: [testbed-node-2] => (item={'key': 
'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release/2024.2/ironic-neutron-agent:26.0.3.20260328', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.566125 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/2024.2/neutron-tls-proxy:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2026-04-20 00:53:49.566133 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 
'registry.osism.tech/dockerhub/kolla/release/2024.2/neutron-ovn-agent:26.0.3.20260328', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2026-04-20 00:53:49.566137 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.566140 | orchestrator | 2026-04-20 00:53:49.566151 | orchestrator | TASK [haproxy-config : Configuring firewall for neutron] *********************** 2026-04-20 00:53:49.566156 | orchestrator | Monday 20 April 2026 00:51:46 +0000 (0:00:01.221) 0:03:24.114 ********** 2026-04-20 00:53:49.566188 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.566194 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.566198 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.566202 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.566206 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}})  
2026-04-20 00:53:49.566210 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.566213 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.566217 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.566221 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.566225 | orchestrator | 2026-04-20 00:53:49.566229 | orchestrator | TASK [proxysql-config : Copying over neutron ProxySQL users config] ************ 2026-04-20 00:53:49.566239 | orchestrator | Monday 20 April 2026 00:51:47 +0000 (0:00:01.254) 0:03:25.368 ********** 2026-04-20 00:53:49.566243 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.566247 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.566250 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.566254 | orchestrator | 2026-04-20 00:53:49.566258 | orchestrator | TASK [proxysql-config : Copying over neutron ProxySQL rules config] ************ 2026-04-20 00:53:49.566262 | orchestrator | Monday 20 April 2026 00:51:49 +0000 (0:00:01.503) 0:03:26.871 ********** 2026-04-20 00:53:49.566269 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:53:49.566272 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:53:49.566276 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:53:49.566280 | orchestrator | 2026-04-20 00:53:49.566283 | orchestrator | TASK [include_role : placement] ************************************************ 2026-04-20 00:53:49.566287 | orchestrator | Monday 20 April 2026 00:51:51 +0000 (0:00:01.911) 0:03:28.783 ********** 2026-04-20 
00:53:49.566291 | orchestrator | included: placement for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:53:49.566295 | orchestrator | 2026-04-20 00:53:49.566298 | orchestrator | TASK [haproxy-config : Copying over placement haproxy config] ****************** 2026-04-20 00:53:49.566302 | orchestrator | Monday 20 April 2026 00:51:52 +0000 (0:00:01.123) 0:03:29.906 ********** 2026-04-20 00:53:49.566309 | orchestrator | changed: [testbed-node-0] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release/2024.2/placement-api:13.0.0.20260328', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8780'], 'timeout': '30'}, 'wsgi': 'placement.wsgi.api:application', 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}}) 2026-04-20 00:53:49.566342 | orchestrator | changed: [testbed-node-2] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release/2024.2/placement-api:13.0.0.20260328', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': 
'3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8780'], 'timeout': '30'}, 'wsgi': 'placement.wsgi.api:application', 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}}) 2026-04-20 00:53:49.566348 | orchestrator | changed: [testbed-node-1] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release/2024.2/placement-api:13.0.0.20260328', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8780'], 'timeout': '30'}, 'wsgi': 'placement.wsgi.api:application', 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}}) 2026-04-20 00:53:49.566357 | orchestrator | 2026-04-20 00:53:49.566361 | orchestrator | TASK [haproxy-config : Add configuration for placement when using single external frontend] *** 2026-04-20 00:53:49.566365 | orchestrator | Monday 20 April 2026 00:51:55 +0000 (0:00:03.050) 0:03:32.957 ********** 2026-04-20 00:53:49.566369 
| orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release/2024.2/placement-api:13.0.0.20260328', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8780'], 'timeout': '30'}, 'wsgi': 'placement.wsgi.api:application', 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})  2026-04-20 00:53:49.566373 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.566380 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release/2024.2/placement-api:13.0.0.20260328', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8780'], 'timeout': '30'}, 'wsgi': 'placement.wsgi.api:application', 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET 
/']}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})  2026-04-20 00:53:49.566384 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.566416 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release/2024.2/placement-api:13.0.0.20260328', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8780'], 'timeout': '30'}, 'wsgi': 'placement.wsgi.api:application', 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})  2026-04-20 00:53:49.566447 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.566452 | orchestrator | 2026-04-20 00:53:49.566456 | orchestrator | TASK [haproxy-config : Configuring firewall for placement] ********************* 2026-04-20 00:53:49.566459 | orchestrator | Monday 20 April 2026 00:51:56 +0000 (0:00:00.919) 0:03:33.876 ********** 2026-04-20 00:53:49.566463 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 
'backend_http_extra': ['option httpchk GET /']}})  2026-04-20 00:53:49.566474 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-04-20 00:53:49.566479 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.566483 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-04-20 00:53:49.566487 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-04-20 00:53:49.566491 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.566495 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-04-20 00:53:49.566499 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-04-20 00:53:49.566503 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.566507 | orchestrator | 2026-04-20 00:53:49.566513 | orchestrator | TASK [proxysql-config : Copying over placement ProxySQL users config] ********** 2026-04-20 00:53:49.566517 
| orchestrator | Monday 20 April 2026 00:51:56 +0000 (0:00:00.750) 0:03:34.626 **********
2026-04-20 00:53:49.566521 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:53:49.566525 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:53:49.566528 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:53:49.566532 | orchestrator |
2026-04-20 00:53:49.566536 | orchestrator | TASK [proxysql-config : Copying over placement ProxySQL rules config] **********
2026-04-20 00:53:49.566540 | orchestrator | Monday 20 April 2026 00:51:58 +0000 (0:00:01.167) 0:03:35.794 **********
2026-04-20 00:53:49.566544 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:53:49.566547 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:53:49.566551 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:53:49.566555 | orchestrator |
2026-04-20 00:53:49.566559 | orchestrator | TASK [include_role : nova] *****************************************************
2026-04-20 00:53:49.566562 | orchestrator | Monday 20 April 2026 00:51:59 +0000 (0:00:01.828) 0:03:37.623 **********
2026-04-20 00:53:49.566566 | orchestrator | included: nova for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:53:49.566570 | orchestrator |
2026-04-20 00:53:49.566574 | orchestrator | TASK [haproxy-config : Copying over nova haproxy config] ***********************
2026-04-20 00:53:49.566578 | orchestrator | Monday 20 April 2026 00:52:01 +0000 (0:00:01.293) 0:03:38.917 **********
2026-04-20 00:53:49.566612 | orchestrator | changed: [testbed-node-0] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-api:31.2.1.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8774 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.osapi_compute:application', 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.566622 | orchestrator | changed: [testbed-node-1] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-api:31.2.1.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8774 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.osapi_compute:application', 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.566629 | orchestrator | changed: [testbed-node-2] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-api:31.2.1.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8774 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.osapi_compute:application', 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.566634 | orchestrator | changed: [testbed-node-0] => (item={'key': 'nova-metadata', 'value': {'container_name': 'nova_metadata', 'group': 'nova-metadata', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-api:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-metadata/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8775 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.metadata:application', 'haproxy': {'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.566657 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-scheduler:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.566680 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-super-conductor:31.2.1.20260328', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.566685 | orchestrator | changed: [testbed-node-1] => (item={'key': 'nova-metadata', 'value': {'container_name': 'nova_metadata', 'group': 'nova-metadata', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-api:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-metadata/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8775 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.metadata:application', 'haproxy': {'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.566690 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-scheduler:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.566696 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-super-conductor:31.2.1.20260328', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.566730 | orchestrator | changed: [testbed-node-2] => (item={'key': 'nova-metadata', 'value': {'container_name': 'nova_metadata', 'group': 'nova-metadata', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-api:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-metadata/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8775 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.metadata:application', 'haproxy': {'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.566739 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-scheduler:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.566743 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-super-conductor:31.2.1.20260328', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.566747 | orchestrator |
2026-04-20 00:53:49.566751 | orchestrator | TASK [haproxy-config : Add configuration for nova when using single external frontend] ***
2026-04-20 00:53:49.566755 | orchestrator | Monday 20 April 2026 00:52:06 +0000 (0:00:04.847) 0:03:43.764 **********
2026-04-20 00:53:49.566761 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-api:31.2.1.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8774 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.osapi_compute:application', 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.566766 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-metadata', 'value': {'container_name': 'nova_metadata', 'group': 'nova-metadata', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-api:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-metadata/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8775 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.metadata:application', 'haproxy': {'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.566792 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-scheduler:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.566797 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-super-conductor:31.2.1.20260328', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.566801 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.566805 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-api:31.2.1.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8774 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.osapi_compute:application', 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.566812 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-metadata', 'value': {'container_name': 'nova_metadata', 'group': 'nova-metadata', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-api:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-metadata/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8775 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.metadata:application', 'haproxy': {'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.566817 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-scheduler:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.566836 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-super-conductor:31.2.1.20260328', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.566840 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.566844 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-api:31.2.1.20260328', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8774 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.osapi_compute:application', 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.566848 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-metadata', 'value': {'container_name': 'nova_metadata', 'group': 'nova-metadata', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-api:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-metadata/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8775 '], 'timeout': '30'}, 'wsgi': 'nova.wsgi.metadata:application', 'haproxy': {'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.566855 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-scheduler:31.2.1.20260328', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.566859 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/2024.2/nova-super-conductor:31.2.1.20260328', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.566866 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.566870 | orchestrator |
2026-04-20 00:53:49.566874 | orchestrator | TASK [haproxy-config : Configuring firewall for nova] **************************
2026-04-20 00:53:49.566889 | orchestrator | Monday 20 April 2026 00:52:06 +0000 (0:00:00.694) 0:03:44.459 **********
2026-04-20 00:53:49.566894 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.566898 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.566902 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.566906 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.566910 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.566914 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.566918 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.566922 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.566926 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.566930 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.566933 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.566937 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.566943 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.566947 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.566955 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.566959 | orchestrator |
2026-04-20 00:53:49.566962 | orchestrator | TASK [proxysql-config : Copying over nova ProxySQL users config] ***************
2026-04-20 00:53:49.566966 | orchestrator | Monday 20 April 2026 00:52:08 +0000 (0:00:01.571) 0:03:46.030 **********
2026-04-20 00:53:49.566970 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:53:49.566974 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:53:49.566977 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:53:49.566981 | orchestrator |
2026-04-20 00:53:49.566985 | orchestrator | TASK [proxysql-config : Copying over nova ProxySQL rules config] ***************
2026-04-20 00:53:49.566989 | orchestrator | Monday 20 April 2026 00:52:09 +0000 (0:00:01.185) 0:03:47.216 **********
2026-04-20 00:53:49.566992 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:53:49.566996 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:53:49.567000 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:53:49.567010 | orchestrator |
2026-04-20 00:53:49.567015 | orchestrator | TASK [include_role : nova-cell] ************************************************
2026-04-20 00:53:49.567018 | orchestrator | Monday 20 April 2026 00:52:11 +0000 (0:00:01.967) 0:03:49.183 **********
2026-04-20 00:53:49.567022 | orchestrator | included: nova-cell for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:53:49.567026 | orchestrator |
2026-04-20 00:53:49.567042 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-novncproxy] ******************
2026-04-20 00:53:49.567047 | orchestrator | Monday 20 April 2026 00:52:12 +0000 (0:00:01.272) 0:03:50.455 **********
2026-04-20 00:53:49.567051 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-novncproxy)
2026-04-20 00:53:49.567055 | orchestrator |
2026-04-20 00:53:49.567059 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-novncproxy haproxy config] ***
2026-04-20 00:53:49.567063 | orchestrator | Monday 20 April 2026 00:52:13 +0000 (0:00:00.988) 0:03:51.444 **********
2026-04-20 00:53:49.567067 | orchestrator | changed: [testbed-node-0] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2026-04-20 00:53:49.567072 | orchestrator | changed: [testbed-node-2] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2026-04-20 00:53:49.567076 | orchestrator | changed: [testbed-node-1] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2026-04-20 00:53:49.567080 | orchestrator |
2026-04-20 00:53:49.567084 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-novncproxy when using single external frontend] ***
2026-04-20 00:53:49.567088 | orchestrator | Monday 20 April 2026 00:52:17 +0000 (0:00:03.274) 0:03:54.719 **********
2026-04-20 00:53:49.567103 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2026-04-20 00:53:49.567108 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.567115 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2026-04-20 00:53:49.567119 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.567123 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2026-04-20 00:53:49.567127 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.567131 | orchestrator |
2026-04-20 00:53:49.567135 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-novncproxy] *****
2026-04-20 00:53:49.567139 | orchestrator | Monday 20 April 2026 00:52:18 +0000 (0:00:01.126) 0:03:55.845 **********
2026-04-20 00:53:49.567155 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})
2026-04-20 00:53:49.567160 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})
2026-04-20 00:53:49.567164 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.567168 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})
2026-04-20 00:53:49.567175 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})
2026-04-20 00:53:49.567179 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})
2026-04-20 00:53:49.567183 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.567187 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})
2026-04-20 00:53:49.567191 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.567195 | orchestrator |
2026-04-20 00:53:49.567200 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] **********
2026-04-20 00:53:49.567207 | orchestrator | Monday 20 April 2026 00:52:19 +0000 (0:00:01.347) 0:03:57.192 **********
2026-04-20 00:53:49.567211 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:53:49.567215 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:53:49.567219 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:53:49.567223 | orchestrator |
2026-04-20 00:53:49.567227 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] **********
2026-04-20 00:53:49.567231 | orchestrator | Monday 20 April 2026 00:52:21 +0000 (0:00:01.996) 0:03:59.188 **********
2026-04-20 00:53:49.567235 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:53:49.567239 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:53:49.567243 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:53:49.567247 | orchestrator |
2026-04-20 00:53:49.567251 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-spicehtml5proxy] *************
2026-04-20 00:53:49.567255 | orchestrator | Monday 20 April 2026 00:52:24 +0000 (0:00:02.636) 0:04:01.824 **********
2026-04-20 00:53:49.567259 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-spicehtml5proxy)
2026-04-20 00:53:49.567262 | orchestrator |
2026-04-20 00:53:49.567266 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-spicehtml5proxy haproxy config] ***
2026-04-20 00:53:49.567270 | orchestrator | Monday 20 April 2026 00:52:24 +0000 (0:00:00.764) 0:04:02.589 **********
2026-04-20 00:53:49.567277 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2026-04-20 00:53:49.567281 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.567285 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2026-04-20 00:53:49.567289 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.567305 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2026-04-20 00:53:49.567310 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.567314 | orchestrator |
2026-04-20 00:53:49.567318 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-spicehtml5proxy when using single external frontend] ***
2026-04-20 00:53:49.567322 | orchestrator | Monday 20 April 2026 00:52:26 +0000 (0:00:01.180) 0:04:03.770 **********
2026-04-20 00:53:49.567326 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2026-04-20 00:53:49.567333 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.567337 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2026-04-20 00:53:49.567341 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.567345 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2026-04-20 00:53:49.567349 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.567353 | orchestrator |
2026-04-20 00:53:49.567357 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-spicehtml5proxy] ***
2026-04-20 00:53:49.567361 | orchestrator | Monday 20 April 2026 00:52:27 +0000 (0:00:01.116) 0:04:04.886 **********
2026-04-20 00:53:49.567365 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.567369 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.567373 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.567377 | orchestrator |
2026-04-20
00:53:49.567381 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] ********** 2026-04-20 00:53:49.567385 | orchestrator | Monday 20 April 2026 00:52:28 +0000 (0:00:01.258) 0:04:06.145 ********** 2026-04-20 00:53:49.567389 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:53:49.567393 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:53:49.567397 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:53:49.567401 | orchestrator | 2026-04-20 00:53:49.567405 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] ********** 2026-04-20 00:53:49.567409 | orchestrator | Monday 20 April 2026 00:52:30 +0000 (0:00:02.244) 0:04:08.390 ********** 2026-04-20 00:53:49.567413 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:53:49.567420 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:53:49.567442 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:53:49.567447 | orchestrator | 2026-04-20 00:53:49.567451 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-serialproxy] ***************** 2026-04-20 00:53:49.567456 | orchestrator | Monday 20 April 2026 00:52:33 +0000 (0:00:02.614) 0:04:11.004 ********** 2026-04-20 00:53:49.567460 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-serialproxy) 2026-04-20 00:53:49.567464 | orchestrator | 2026-04-20 00:53:49.567469 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-serialproxy haproxy config] *** 2026-04-20 00:53:49.567473 | orchestrator | Monday 20 April 2026 00:52:34 +0000 (0:00:01.060) 0:04:12.065 ********** 2026-04-20 00:53:49.567477 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': 
['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2026-04-20 00:53:49.567481 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.567500 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2026-04-20 00:53:49.567509 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.567513 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2026-04-20 00:53:49.567517 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.567522 | orchestrator | 2026-04-20 00:53:49.567526 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-serialproxy when using single external frontend] *** 2026-04-20 00:53:49.567530 | orchestrator | Monday 20 April 2026 00:52:35 +0000 (0:00:00.981) 0:04:13.047 ********** 2026-04-20 00:53:49.567535 | orchestrator | skipping: [testbed-node-0] => (item={'key': 
'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2026-04-20 00:53:49.567539 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.567544 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2026-04-20 00:53:49.567548 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.567555 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2026-04-20 00:53:49.567560 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.567564 | orchestrator | 2026-04-20 00:53:49.567568 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-serialproxy] **** 
2026-04-20 00:53:49.567573 | orchestrator | Monday 20 April 2026 00:52:36 +0000 (0:00:01.070) 0:04:14.117 ********** 2026-04-20 00:53:49.567577 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.567581 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.567585 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.567590 | orchestrator | 2026-04-20 00:53:49.567594 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] ********** 2026-04-20 00:53:49.567598 | orchestrator | Monday 20 April 2026 00:52:37 +0000 (0:00:01.295) 0:04:15.413 ********** 2026-04-20 00:53:49.567605 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:53:49.567610 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:53:49.567614 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:53:49.567619 | orchestrator | 2026-04-20 00:53:49.567623 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] ********** 2026-04-20 00:53:49.567627 | orchestrator | Monday 20 April 2026 00:52:40 +0000 (0:00:02.335) 0:04:17.749 ********** 2026-04-20 00:53:49.567631 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:53:49.567635 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:53:49.567640 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:53:49.567644 | orchestrator | 2026-04-20 00:53:49.567648 | orchestrator | TASK [include_role : octavia] ************************************************** 2026-04-20 00:53:49.567652 | orchestrator | Monday 20 April 2026 00:52:42 +0000 (0:00:02.900) 0:04:20.649 ********** 2026-04-20 00:53:49.567657 | orchestrator | included: octavia for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:53:49.567661 | orchestrator | 2026-04-20 00:53:49.567665 | orchestrator | TASK [haproxy-config : Copying over octavia haproxy config] ******************** 2026-04-20 00:53:49.567670 | orchestrator | Monday 20 April 2026 00:52:44 +0000 (0:00:01.205) 0:04:21.855 ********** 2026-04-20 
00:53:49.567686 | orchestrator | changed: [testbed-node-0] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-api:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}}) 2026-04-20 00:53:49.567692 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-driver-agent:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2026-04-20 00:53:49.567696 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-health-manager:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2026-04-20 00:53:49.567704 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-housekeeping:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2026-04-20 00:53:49.567712 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-worker:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.567728 | orchestrator | changed: [testbed-node-1] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-api:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}}) 2026-04-20 00:53:49.567734 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-driver-agent:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2026-04-20 00:53:49.567738 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-health-manager:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2026-04-20 00:53:49.567743 | orchestrator | skipping: [testbed-node-1] => (item={'key': 
'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-housekeeping:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2026-04-20 00:53:49.567747 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-worker:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.567759 | orchestrator | changed: [testbed-node-2] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-api:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 
'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}}) 2026-04-20 00:53:49.567775 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-driver-agent:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2026-04-20 00:53:49.567780 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-health-manager:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2026-04-20 00:53:49.567785 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-housekeeping:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2026-04-20 00:53:49.567789 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-worker:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.567794 | orchestrator | 2026-04-20 00:53:49.567798 | orchestrator | TASK [haproxy-config : Add configuration for octavia when using single external frontend] *** 2026-04-20 00:53:49.567803 | orchestrator | Monday 20 April 2026 00:52:47 +0000 (0:00:03.134) 0:04:24.989 ********** 2026-04-20 00:53:49.567813 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-api:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 
'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})  2026-04-20 00:53:49.567829 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-api:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})  2026-04-20 00:53:49.567834 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-driver-agent:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2026-04-20 00:53:49.567838 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': 
True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-driver-agent:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2026-04-20 00:53:49.567842 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-health-manager:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2026-04-20 00:53:49.567847 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-health-manager:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2026-04-20 00:53:49.567858 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/octavia-housekeeping:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2026-04-20 00:53:49.567862 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-housekeeping:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2026-04-20 00:53:49.567877 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-worker:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2026-04-20 00:53:49.567882 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.567886 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-worker', 'value': 
{'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-worker:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.567890 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.567894 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-api:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})
2026-04-20 00:53:49.567901 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-driver-agent:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})
2026-04-20 00:53:49.567908 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-health-manager:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})
2026-04-20 00:53:49.567912 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-housekeeping:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})
2026-04-20 00:53:49.567927 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/octavia-worker:16.0.2.20260328', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})
2026-04-20 00:53:49.567931 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.567935 | orchestrator |
2026-04-20 00:53:49.567939 | orchestrator | TASK [haproxy-config : Configuring firewall for octavia] ***********************
2026-04-20 00:53:49.567943 | orchestrator | Monday 20 April 2026 00:52:48 +0000 (0:00:00.754) 0:04:25.743 **********
2026-04-20 00:53:49.567948 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})
2026-04-20 00:53:49.567952 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})
2026-04-20 00:53:49.567955 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.567959 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})
2026-04-20 00:53:49.567963 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})
2026-04-20 00:53:49.567971 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.567975 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})
2026-04-20 00:53:49.567978 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})
2026-04-20 00:53:49.567982 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.567986 | orchestrator |
2026-04-20 00:53:49.567990 | orchestrator | TASK [proxysql-config : Copying over octavia ProxySQL users config] ************
2026-04-20 00:53:49.567994 | orchestrator | Monday 20 April 2026 00:52:48 +0000 (0:00:00.849) 0:04:26.593 **********
2026-04-20 00:53:49.567997 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:53:49.568001 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:53:49.568005 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:53:49.568009 | orchestrator |
2026-04-20 00:53:49.568012 | orchestrator | TASK [proxysql-config : Copying over octavia ProxySQL rules config] ************
2026-04-20 00:53:49.568016 | orchestrator | Monday 20 April 2026 00:52:50 +0000 (0:00:01.244) 0:04:27.838 **********
2026-04-20 00:53:49.568020 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:53:49.568023 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:53:49.568027 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:53:49.568031 | orchestrator |
2026-04-20 00:53:49.568035 | orchestrator | TASK [include_role : opensearch] ***********************************************
2026-04-20 00:53:49.568041 | orchestrator | Monday 20 April 2026 00:52:52 +0000 (0:00:02.085) 0:04:29.923 **********
2026-04-20 00:53:49.568045 | orchestrator | included: opensearch for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:53:49.568048 | orchestrator |
2026-04-20 00:53:49.568052 | orchestrator | TASK [haproxy-config : Copying over opensearch haproxy config] *****************
2026-04-20 00:53:49.568056 | orchestrator | Monday 20 April 2026 00:52:53 +0000 (0:00:01.484) 0:04:31.408 **********
2026-04-20 00:53:49.568061 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.568077 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.568082 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.568092 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-04-20 00:53:49.568096 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-04-20 00:53:49.568113 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-04-20 00:53:49.568121 | orchestrator |
2026-04-20 00:53:49.568125 | orchestrator | TASK [haproxy-config : Add configuration for opensearch when using single external frontend] ***
2026-04-20 00:53:49.568129 | orchestrator | Monday 20 April 2026 00:52:59 +0000 (0:00:05.363) 0:04:36.771 **********
2026-04-20 00:53:49.568133 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.568140 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-04-20 00:53:49.568144 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.568159 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.568164 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-04-20 00:53:49.568170 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.568174 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.568181 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})
2026-04-20 00:53:49.568185 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.568189 | orchestrator |
2026-04-20 00:53:49.568193 | orchestrator | TASK [haproxy-config : Configuring firewall for opensearch] ********************
2026-04-20 00:53:49.568197 | orchestrator | Monday 20 April 2026 00:52:59 +0000 (0:00:00.648) 0:04:37.419 **********
2026-04-20 00:53:49.568201 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.568216 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}})
2026-04-20 00:53:49.568224 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}})
2026-04-20 00:53:49.568229 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.568232 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.568236 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}})
2026-04-20 00:53:49.568240 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}})
2026-04-20 00:53:49.568244 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}})
2026-04-20 00:53:49.568248 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.568252 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}})
2026-04-20 00:53:49.568256 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}})
2026-04-20 00:53:49.568260 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.568263 | orchestrator |
2026-04-20 00:53:49.568267 | orchestrator | TASK [proxysql-config : Copying over opensearch ProxySQL users config] *********
2026-04-20 00:53:49.568271 | orchestrator | Monday 20 April 2026 00:53:01 +0000 (0:00:01.292) 0:04:38.712 **********
2026-04-20 00:53:49.568275 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.568278 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.568282 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.568286 | orchestrator |
2026-04-20 00:53:49.568289 | orchestrator | TASK [proxysql-config : Copying over opensearch ProxySQL rules config] *********
2026-04-20 00:53:49.568296 | orchestrator | Monday 20 April 2026 00:53:01 +0000 (0:00:00.431) 0:04:39.144 **********
2026-04-20 00:53:49.568300 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.568303 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.568307 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.568311 | orchestrator |
2026-04-20 00:53:49.568315 | orchestrator | TASK [include_role : prometheus] ***********************************************
2026-04-20 00:53:49.568318 | orchestrator | Monday 20 April 2026 00:53:02 +0000 (0:00:01.259) 0:04:40.403 **********
2026-04-20 00:53:49.568322 | orchestrator | included: prometheus for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:53:49.568326 | orchestrator |
2026-04-20 00:53:49.568330 | orchestrator | TASK [haproxy-config : Copying over prometheus haproxy config] *****************
2026-04-20 00:53:49.568333 | orchestrator | Monday 20 April 2026 00:53:04 +0000 (0:00:01.644) 0:04:42.048 **********
2026-04-20 00:53:49.568349 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})
2026-04-20 00:53:49.568357 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})
2026-04-20 00:53:49.568361 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-20 00:53:49.568366 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-20 00:53:49.568374 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:53:49.568378 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:53:49.568397 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:53:49.568402 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:53:49.568406 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-20 00:53:49.568410 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-20 00:53:49.568415 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})
2026-04-20 00:53:49.568419 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2026-04-20 00:53:49.568449 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:53:49.568466 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:53:49.568471 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2026-04-20 00:53:49.568513 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.568526 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-openstack-exporter:1.7.0.20260328', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}}}})
2026-04-20 00:53:49.568531 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})
2026-04-20 00:53:49.568553 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:53:49.568557 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-openstack-exporter:1.7.0.20260328', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}}}})
2026-04-20 00:53:49.568561 |
orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:53:49.568565 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-20 00:53:49.568572 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:53:49.568590 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-alertmanager:0.28.1.20260328', 'volumes': 
['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:53:49.568595 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:53:49.568599 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-20 00:53:49.568603 | orchestrator | skipping: [testbed-node-2] => 
(item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-openstack-exporter:1.7.0.20260328', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}}}})  2026-04-20 00:53:49.568608 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:53:49.568614 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:53:49.568624 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2026-04-20 00:53:49.568628 | orchestrator |
2026-04-20 00:53:49.568631 | orchestrator | TASK [haproxy-config : Add configuration for prometheus when using single external frontend] ***
2026-04-20 00:53:49.568635 | orchestrator | Monday 20 April 2026 00:53:08 +0000 (0:00:04.256) 0:04:46.304 **********
2026-04-20 00:53:49.568650 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization
'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})  2026-04-20 00:53:49.568655 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 00:53:49.568659 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:53:49.568663 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:53:49.568669 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 00:53:49.568677 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.568692 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})  2026-04-20 00:53:49.568697 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-openstack-exporter:1.7.0.20260328', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}}}})  2026-04-20 00:53:49.568701 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 
'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 00:53:49.568710 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:53:49.568714 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:53:49.568730 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:53:49.568735 | orchestrator | skipping: [testbed-node-1] => 
(item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:53:49.568739 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-20 00:53:49.568743 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.568747 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 00:53:49.568751 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.568761 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-openstack-exporter:1.7.0.20260328', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}}}})  2026-04-20 00:53:49.568768 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-server', 'value': {'container_name': 
'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})  2026-04-20 00:53:49.568772 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:53:49.568776 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': 
['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 00:53:49.568780 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:53:49.568791 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:53:49.568795 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-20 00:53:49.568799 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.568803 | orchestrator | skipping: 
[testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:53:49.568810 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 00:53:49.568814 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 
'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:53:49.568818 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-openstack-exporter:1.7.0.20260328', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['option httpchk', 'timeout server 45s']}}}})  2026-04-20 00:53:49.568828 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 00:53:49.568832 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 
'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2026-04-20 00:53:49.568836 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2026-04-20 00:53:49.568840 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.568844 | orchestrator |
2026-04-20 00:53:49.568850 | orchestrator | TASK [haproxy-config : Configuring firewall for prometheus] ********************
2026-04-20 00:53:49.568854 | orchestrator | Monday 20 April 2026 00:53:09 +0000 (0:00:00.882) 0:04:47.186 **********
2026-04-20 00:53:49.568858 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}})
2026-04-20 00:53:49.568862 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check
send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}})  2026-04-20 00:53:49.568866 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.568870 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.568877 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.568881 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}})  2026-04-20 00:53:49.568885 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}})  2026-04-20 00:53:49.568889 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 
'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.568895 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.568899 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.568903 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}})  2026-04-20 00:53:49.568907 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}})  2026-04-20 00:53:49.568913 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.568917 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 
'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}})  2026-04-20 00:53:49.568921 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.568925 | orchestrator | 2026-04-20 00:53:49.568929 | orchestrator | TASK [proxysql-config : Copying over prometheus ProxySQL users config] ********* 2026-04-20 00:53:49.568932 | orchestrator | Monday 20 April 2026 00:53:10 +0000 (0:00:01.419) 0:04:48.606 ********** 2026-04-20 00:53:49.568936 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.568940 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.568944 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.568947 | orchestrator | 2026-04-20 00:53:49.568951 | orchestrator | TASK [proxysql-config : Copying over prometheus ProxySQL rules config] ********* 2026-04-20 00:53:49.568958 | orchestrator | Monday 20 April 2026 00:53:11 +0000 (0:00:00.473) 0:04:49.079 ********** 2026-04-20 00:53:49.568962 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.568966 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.568969 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.568973 | orchestrator | 2026-04-20 00:53:49.568977 | orchestrator | TASK [include_role : rabbitmq] ************************************************* 2026-04-20 00:53:49.568981 | orchestrator | Monday 20 April 2026 00:53:12 +0000 (0:00:01.337) 0:04:50.417 ********** 2026-04-20 00:53:49.568984 | orchestrator | included: rabbitmq for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:53:49.568988 | orchestrator | 2026-04-20 00:53:49.568992 | orchestrator | TASK [haproxy-config : Copying over rabbitmq haproxy config] ******************* 2026-04-20 00:53:49.568996 | orchestrator | Monday 20 April 2026 00:53:14 +0000 (0:00:01.377) 0:04:51.794 ********** 2026-04-20 00:53:49.569000 | orchestrator | changed: 
[testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-20 00:53:49.569007 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 
'rabbitmq'}}}}) 2026-04-20 00:53:49.569014 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2026-04-20 00:53:49.569018 | orchestrator | 2026-04-20 00:53:49.569022 | orchestrator | TASK [haproxy-config : Add configuration for rabbitmq when using single external frontend] *** 2026-04-20 00:53:49.569029 | orchestrator | Monday 20 April 2026 00:53:16 +0000 (0:00:02.779) 0:04:54.574 ********** 2026-04-20 00:53:49.569033 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-20 00:53:49.569037 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.569041 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-20 00:53:49.569045 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.569052 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/rabbitmq:4.1.8.20260328', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2026-04-20 00:53:49.569056 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.569060 | orchestrator | 2026-04-20 00:53:49.569064 | orchestrator | TASK [haproxy-config : Configuring firewall for rabbitmq] ********************** 2026-04-20 00:53:49.569068 | orchestrator | Monday 20 April 2026 00:53:17 +0000 (0:00:00.403) 0:04:54.977 ********** 2026-04-20 00:53:49.569072 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})  2026-04-20 00:53:49.569076 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.569081 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})  2026-04-20 00:53:49.569089 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.569093 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})  2026-04-20 00:53:49.569097 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.569100 | orchestrator | 2026-04-20 00:53:49.569104 | orchestrator | TASK [proxysql-config : Copying over rabbitmq ProxySQL users config] *********** 2026-04-20 00:53:49.569108 | orchestrator | 
Monday 20 April 2026 00:53:17 +0000 (0:00:00.622) 0:04:55.599 ********** 2026-04-20 00:53:49.569112 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.569115 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.569119 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.569123 | orchestrator | 2026-04-20 00:53:49.569127 | orchestrator | TASK [proxysql-config : Copying over rabbitmq ProxySQL rules config] *********** 2026-04-20 00:53:49.569130 | orchestrator | Monday 20 April 2026 00:53:18 +0000 (0:00:00.454) 0:04:56.054 ********** 2026-04-20 00:53:49.569134 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.569138 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.569142 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.569145 | orchestrator | 2026-04-20 00:53:49.569149 | orchestrator | TASK [include_role : skyline] ************************************************** 2026-04-20 00:53:49.569153 | orchestrator | Monday 20 April 2026 00:53:19 +0000 (0:00:01.416) 0:04:57.471 ********** 2026-04-20 00:53:49.569157 | orchestrator | included: skyline for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:53:49.569160 | orchestrator | 2026-04-20 00:53:49.569164 | orchestrator | TASK [haproxy-config : Copying over skyline haproxy config] ******************** 2026-04-20 00:53:49.569168 | orchestrator | Monday 20 April 2026 00:53:21 +0000 (0:00:01.821) 0:04:59.293 ********** 2026-04-20 00:53:49.569172 | orchestrator | changed: [testbed-node-0] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/skyline-apiserver:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 
'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}}}}) 2026-04-20 00:53:49.569178 | orchestrator | changed: [testbed-node-1] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/skyline-apiserver:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}}}}) 2026-04-20 00:53:49.569189 | orchestrator | changed: [testbed-node-2] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/skyline-apiserver:6.0.1.20260328', 'volumes': 
['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}}}}) 2026-04-20 00:53:49.569194 | orchestrator | changed: [testbed-node-0] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/skyline-console:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}}) 2026-04-20 00:53:49.569198 | orchestrator | changed: [testbed-node-1] => (item={'key': 'skyline-console', 'value': {'container_name': 
'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/skyline-console:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}}) 2026-04-20 00:53:49.569204 | orchestrator | changed: [testbed-node-2] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/skyline-console:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET 
/']}}}}) 2026-04-20 00:53:49.569212 | orchestrator | 2026-04-20 00:53:49.569216 | orchestrator | TASK [haproxy-config : Add configuration for skyline when using single external frontend] *** 2026-04-20 00:53:49.569220 | orchestrator | Monday 20 April 2026 00:53:27 +0000 (0:00:05.412) 0:05:04.705 ********** 2026-04-20 00:53:49.569227 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/skyline-apiserver:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}}}})  2026-04-20 00:53:49.569231 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/skyline-console:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl 
http://192.168.16.10:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})  2026-04-20 00:53:49.569236 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.569240 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/skyline-apiserver:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}}}})  2026-04-20 00:53:49.569247 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/skyline-console:6.0.1.20260328', 'volumes': 
['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})  2026-04-20 00:53:49.569255 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:53:49.569262 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/skyline-apiserver:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}}}})  2026-04-20 00:53:49.569266 | orchestrator | skipping: 
[testbed-node-2] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/skyline-console:6.0.1.20260328', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}}}})  2026-04-20 00:53:49.569271 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:53:49.569275 | orchestrator | 2026-04-20 00:53:49.569279 | orchestrator | TASK [haproxy-config : Configuring firewall for skyline] *********************** 2026-04-20 00:53:49.569282 | orchestrator | Monday 20 April 2026 00:53:27 +0000 (0:00:00.862) 0:05:05.568 ********** 2026-04-20 00:53:49.569286 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}})  2026-04-20 00:53:49.569291 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}})  
2026-04-20 00:53:49.569298 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-04-20 00:53:49.569305 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-04-20 00:53:49.569309 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:53:49.569313 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}})  2026-04-20 00:53:49.569317 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}})  2026-04-20 00:53:49.569321 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-04-20 00:53:49.569326 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})  2026-04-20 00:53:49.569330 | orchestrator | skipping: [testbed-node-1] 
2026-04-20 00:53:49.569334 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}})
2026-04-20 00:53:49.569338 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /docs']}})
2026-04-20 00:53:49.569342 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})
2026-04-20 00:53:49.569346 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no', 'backend_http_extra': ['option httpchk GET /']}})
2026-04-20 00:53:49.569360 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.569364 | orchestrator |
2026-04-20 00:53:49.569367 | orchestrator | TASK [proxysql-config : Copying over skyline ProxySQL users config] ************
2026-04-20 00:53:49.569371 | orchestrator | Monday 20 April 2026 00:53:28 +0000 (0:00:01.116) 0:05:06.684 **********
2026-04-20 00:53:49.569375 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:53:49.569379 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:53:49.569383 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:53:49.569386 | orchestrator |
2026-04-20 00:53:49.569390 | orchestrator | TASK [proxysql-config : Copying over skyline ProxySQL rules config] ************
2026-04-20 00:53:49.569394 | orchestrator | Monday 20 April 2026 00:53:30 +0000 (0:00:01.254) 0:05:07.939 **********
2026-04-20 00:53:49.569398 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:53:49.569402 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:53:49.569405 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:53:49.569409 | orchestrator |
2026-04-20 00:53:49.569413 | orchestrator | TASK [include_role : tacker] ***************************************************
2026-04-20 00:53:49.569420 | orchestrator | Monday 20 April 2026 00:53:32 +0000 (0:00:02.023) 0:05:09.963 **********
2026-04-20 00:53:49.569439 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.569450 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.569454 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.569458 | orchestrator |
2026-04-20 00:53:49.569462 | orchestrator | TASK [include_role : trove] ****************************************************
2026-04-20 00:53:49.569466 | orchestrator | Monday 20 April 2026 00:53:32 +0000 (0:00:00.296) 0:05:10.260 **********
2026-04-20 00:53:49.569469 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.569473 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.569477 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.569481 | orchestrator |
2026-04-20 00:53:49.569488 | orchestrator | TASK [include_role : venus] ****************************************************
2026-04-20 00:53:49.569494 | orchestrator | Monday 20 April 2026 00:53:33 +0000 (0:00:00.484) 0:05:10.744 **********
2026-04-20 00:53:49.569500 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.569506 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.569511 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.569519 | orchestrator |
2026-04-20 00:53:49.569530 | orchestrator | TASK [include_role : watcher] **************************************************
2026-04-20 00:53:49.569540 | orchestrator | Monday 20 April 2026 00:53:33 +0000 (0:00:00.279) 0:05:11.023 **********
2026-04-20 00:53:49.569548 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.569554 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.569560 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.569566 | orchestrator |
2026-04-20 00:53:49.569572 | orchestrator | TASK [include_role : zun] ******************************************************
2026-04-20 00:53:49.569578 | orchestrator | Monday 20 April 2026 00:53:33 +0000 (0:00:00.277) 0:05:11.301 **********
2026-04-20 00:53:49.569584 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.569590 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.569597 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.569603 | orchestrator |
2026-04-20 00:53:49.569609 | orchestrator | TASK [include_role : loadbalancer] *********************************************
2026-04-20 00:53:49.569616 | orchestrator | Monday 20 April 2026 00:53:33 +0000 (0:00:00.265) 0:05:11.567 **********
2026-04-20 00:53:49.569623 | orchestrator | included: loadbalancer for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:53:49.569629 | orchestrator |
2026-04-20 00:53:49.569635 | orchestrator | TASK [service-check-containers : loadbalancer | Check containers] **************
2026-04-20 00:53:49.569641 | orchestrator | Monday 20 April 2026 00:53:35 +0000 (0:00:01.586) 0:05:13.154 **********
2026-04-20 00:53:49.569652 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})
2026-04-20 00:53:49.569659 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})
2026-04-20 00:53:49.569671 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})
2026-04-20 00:53:49.569678 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-20 00:53:49.569689 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-20 00:53:49.569696 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-20 00:53:49.569703 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2026-04-20 00:53:49.569715 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2026-04-20 00:53:49.569719 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2026-04-20 00:53:49.569727 | orchestrator |
2026-04-20 00:53:49.569731 | orchestrator | TASK [service-check-containers : loadbalancer | Notify handlers to restart containers] ***
2026-04-20 00:53:49.569735 | orchestrator | Monday 20 April 2026 00:53:37 +0000 (0:00:02.355) 0:05:15.509 **********
2026-04-20 00:53:49.569739 | orchestrator | changed: [testbed-node-0] => {
2026-04-20 00:53:49.569742 | orchestrator |     "msg": "Notifying handlers"
2026-04-20 00:53:49.569746 | orchestrator | }
2026-04-20 00:53:49.569750 | orchestrator | changed: [testbed-node-1] => {
2026-04-20 00:53:49.569754 | orchestrator |     "msg": "Notifying handlers"
2026-04-20 00:53:49.569758 | orchestrator | }
2026-04-20 00:53:49.569761 | orchestrator | changed: [testbed-node-2] => {
2026-04-20 00:53:49.569765 | orchestrator |     "msg": "Notifying handlers"
2026-04-20 00:53:49.569769 | orchestrator | }
2026-04-20 00:53:49.569773 | orchestrator |
2026-04-20 00:53:49.569776 | orchestrator | TASK [service-check-containers : Include tasks] ********************************
2026-04-20 00:53:49.569780 | orchestrator | Monday 20 April 2026 00:53:38 +0000 (0:00:00.323) 0:05:15.833 **********
2026-04-20 00:53:49.569784 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})
2026-04-20 00:53:49.569791 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-20 00:53:49.569795 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2026-04-20 00:53:49.569799 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:53:49.569805 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})
2026-04-20 00:53:49.569809 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-20 00:53:49.569816 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2026-04-20 00:53:49.569820 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:53:49.569824 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/haproxy:2.8.16.20260328', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})
2026-04-20 00:53:49.569828 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/proxysql:3.0.6.20260328', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2026-04-20 00:53:49.569834 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keepalived:2.2.8.20260328', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2026-04-20 00:53:49.569838 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:53:49.569842 | orchestrator |
2026-04-20 00:53:49.569846 | orchestrator | RUNNING HANDLER [loadbalancer : Check IP addresses on the API interface] *******
2026-04-20 00:53:49.569849 | orchestrator | Monday 20 April 2026 00:53:39 +0000 (0:00:01.527) 0:05:17.361 **********
2026-04-20 00:53:49.569853 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:53:49.569857 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:53:49.569861 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:53:49.569865 | orchestrator |
2026-04-20 00:53:49.569869 | orchestrator | RUNNING HANDLER [loadbalancer : Group HA nodes by status] **********************
2026-04-20 00:53:49.569873 | orchestrator | Monday 20 April 2026 00:53:40 +0000 (0:00:00.815) 0:05:18.176 **********
2026-04-20 00:53:49.569876 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:53:49.569880 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:53:49.569887 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:53:49.569891 | orchestrator |
2026-04-20 00:53:49.569895 | orchestrator | RUNNING HANDLER [loadbalancer : Stop backup keepalived container] **************
2026-04-20 00:53:49.569899 | orchestrator | Monday 20 April 2026 00:53:40 +0000 (0:00:00.310) 0:05:18.487 **********
2026-04-20 00:53:49.569902 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:53:49.569906 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:53:49.569910 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:53:49.569913 | orchestrator |
2026-04-20 00:53:49.569917 | orchestrator | RUNNING HANDLER [loadbalancer : Stop backup haproxy container] *****************
2026-04-20 00:53:49.569923 | orchestrator | Monday 20 April 2026 00:53:41 +0000 (0:00:00.910) 0:05:19.397 **********
2026-04-20 00:53:49.569927 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:53:49.569931 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:53:49.569934 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:53:49.569938 | orchestrator |
2026-04-20 00:53:49.569942 | orchestrator | RUNNING HANDLER [loadbalancer : Stop backup proxysql container] ****************
2026-04-20 00:53:49.569946 | orchestrator | Monday 20 April 2026 00:53:42 +0000 (0:00:00.880) 0:05:20.277 **********
2026-04-20 00:53:49.569949 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:53:49.569953 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:53:49.569957 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:53:49.569961 | orchestrator |
2026-04-20 00:53:49.569964 | orchestrator | RUNNING HANDLER [loadbalancer : Start backup haproxy container] ****************
2026-04-20 00:53:49.569968 | orchestrator | Monday 20 April 2026 00:53:43 +0000 (0:00:00.867) 0:05:21.145 **********
2026-04-20 00:53:49.569976 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=2.8.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fhaproxy\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_bfsly8py/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_bfsly8py/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_bfsly8py/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_bfsly8py/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=2.8.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fhaproxy: Internal Server Error (\"unknown: repository kolla/release/2024.2/haproxy not found\")\\n'"}
2026-04-20 00:53:49.569991 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=2.8.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fhaproxy\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_1eicx5qn/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_1eicx5qn/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_1eicx5qn/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_1eicx5qn/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=2.8.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fhaproxy: Internal Server Error (\"unknown: repository kolla/release/2024.2/haproxy not found\")\\n'"}
2026-04-20 00:53:49.569999 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=2.8.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fhaproxy\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_3bde5eat/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_3bde5eat/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_3bde5eat/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_3bde5eat/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=2.8.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fhaproxy: Internal Server Error (\"unknown: repository kolla/release/2024.2/haproxy not found\")\\n'"}
2026-04-20 00:53:49.570009 | orchestrator |
2026-04-20 00:53:49.570041 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:53:49.570048 | orchestrator | testbed-node-0 : ok=120  changed=76  unreachable=0  failed=1  skipped=88  rescued=0  ignored=0
2026-04-20 00:53:49.570053 | orchestrator | testbed-node-1 : ok=119  changed=76  unreachable=0  failed=1  skipped=88  rescued=0  ignored=0
2026-04-20 00:53:49.570057 | orchestrator | testbed-node-2 : ok=119  changed=76  unreachable=0  failed=1  skipped=88  rescued=0  ignored=0
2026-04-20 00:53:49.570060 | orchestrator |
2026-04-20 00:53:49.570064 | orchestrator |
2026-04-20 00:53:49.570068 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:53:49.570072 | orchestrator | Monday 20 April 2026 00:53:46 +0000 (0:00:02.668) 0:05:23.814 **********
2026-04-20 00:53:49.570075 | orchestrator | ===============================================================================
2026-04-20 00:53:49.570079 | orchestrator | haproxy-config : Copying over skyline haproxy config -------------------- 5.41s
2026-04-20 00:53:49.570083 | orchestrator | haproxy-config : Copying over opensearch haproxy config ----------------- 5.36s
2026-04-20 00:53:49.570087 | orchestrator | haproxy-config : Copying over glance haproxy config --------------------- 4.97s
2026-04-20 00:53:49.570090 | orchestrator | haproxy-config : Copying over nova haproxy config ----------------------- 4.85s
2026-04-20 00:53:49.570094 | orchestrator | haproxy-config : Copying over designate haproxy config ------------------ 4.82s
2026-04-20 00:53:49.570098 | orchestrator | sysctl : Setting sysctl values ------------------------------------------ 4.67s
2026-04-20 00:53:49.570102 | orchestrator | haproxy-config : Copying over neutron haproxy config -------------------- 4.27s
2026-04-20 00:53:49.570105 | orchestrator | haproxy-config : Copying over prometheus haproxy config ----------------- 4.26s
2026-04-20 00:53:49.570109 | orchestrator | loadbalancer : Copying checks for services which are enabled ------------ 4.19s
2026-04-20 00:53:49.570113 | orchestrator | haproxy-config : Copying over barbican haproxy config ------------------- 4.17s
2026-04-20 00:53:49.570116 | orchestrator | haproxy-config : Copying over keystone haproxy config ------------------- 4.10s
2026-04-20 00:53:49.570120 | orchestrator | haproxy-config : Copying over cinder haproxy config --------------------- 3.94s
2026-04-20 00:53:49.570124 | orchestrator | loadbalancer : Copying over config.json files for services -------------- 3.88s
2026-04-20 00:53:49.570128 | orchestrator | haproxy-config : Copying over manila haproxy config --------------------- 3.83s
2026-04-20 00:53:49.570131 | orchestrator | haproxy-config : Copying over magnum haproxy config --------------------- 3.68s
2026-04-20 00:53:49.570135 | orchestrator | service-cert-copy : mariadb | Copying over extra CA certificates -------- 3.68s
2026-04-20 00:53:49.570139 | orchestrator | loadbalancer : Copying over proxysql config ----------------------------- 3.65s
2026-04-20 00:53:49.570142 | orchestrator | haproxy-config : Copying over aodh haproxy config ----------------------- 3.55s
2026-04-20 00:53:49.570151 | orchestrator | haproxy-config : Add configuration for glance when using single external frontend --- 3.48s
2026-04-20 00:53:49.570154 | orchestrator | haproxy-config : Configuring firewall for mariadb ----------------------- 3.45s
2026-04-20 00:53:49.570158 | orchestrator | 2026-04-20 00:53:49 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:53:52.596266 | orchestrator | 2026-04-20 00:53:52 | INFO  | Task e7f09b97-4413-48e5-82eb-682ca07fb073 is in state STARTED
2026-04-20 00:53:52.598655 | orchestrator | 2026-04-20 00:53:52 | INFO  | Task d5ae424c-8b69-46b4-bdab-8948d1da77c1 is in state STARTED
2026-04-20 00:53:52.601157 | orchestrator | 2026-04-20 00:53:52 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:53:52.601265 | orchestrator | 2026-04-20 00:53:52 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:53:55.639662 | orchestrator | 2026-04-20 00:53:55 | INFO  | Task e7f09b97-4413-48e5-82eb-682ca07fb073 is in state STARTED
2026-04-20 00:53:55.639714 | orchestrator | 2026-04-20 00:53:55 | INFO  | Task d5ae424c-8b69-46b4-bdab-8948d1da77c1 is in state STARTED
2026-04-20 00:53:55.640968 | orchestrator | 2026-04-20 00:53:55 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:53:55.641308 | orchestrator | 2026-04-20 00:53:55 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:53:58.665772 | orchestrator | 2026-04-20 00:53:58 | INFO  | Task e7f09b97-4413-48e5-82eb-682ca07fb073 is in state STARTED
2026-04-20 00:53:58.665882 | orchestrator | 2026-04-20 00:53:58 | INFO  | Task d5ae424c-8b69-46b4-bdab-8948d1da77c1 is in state STARTED
2026-04-20 00:53:58.666929 | orchestrator | 2026-04-20 00:53:58 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:53:58.666955 | orchestrator | 2026-04-20 00:53:58 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:54:01.698340 | orchestrator | 2026-04-20 00:54:01 | INFO  | Task e7f09b97-4413-48e5-82eb-682ca07fb073 is in state STARTED
2026-04-20 00:54:01.701813 | orchestrator | 2026-04-20 00:54:01 | INFO  | Task d5ae424c-8b69-46b4-bdab-8948d1da77c1 is in state STARTED
2026-04-20 00:54:01.704082 | orchestrator | 2026-04-20 00:54:01 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:54:01.704151 | orchestrator | 2026-04-20 00:54:01 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:54:04.738318 | orchestrator | 2026-04-20 00:54:04 | INFO  | Task e7f09b97-4413-48e5-82eb-682ca07fb073 is in state STARTED
2026-04-20 00:54:04.741468 | orchestrator | 2026-04-20 00:54:04 | INFO  | Task d5ae424c-8b69-46b4-bdab-8948d1da77c1 is in state STARTED
2026-04-20 00:54:04.744421 | orchestrator | 2026-04-20 00:54:04 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:54:04.745912 | orchestrator | 2026-04-20 00:54:04 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:54:07.778129 | orchestrator | 2026-04-20 00:54:07 | INFO  | Task e7f09b97-4413-48e5-82eb-682ca07fb073 is in state STARTED
2026-04-20 00:54:07.778795 | orchestrator | 2026-04-20 00:54:07 | INFO  | Task d5ae424c-8b69-46b4-bdab-8948d1da77c1 is in state STARTED
2026-04-20 00:54:07.780689 | orchestrator | 2026-04-20 00:54:07 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:54:07.780721 | orchestrator | 2026-04-20 00:54:07 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:54:10.805532 | orchestrator | 2026-04-20 00:54:10 | INFO  | Task e7f09b97-4413-48e5-82eb-682ca07fb073 is in state STARTED
2026-04-20 00:54:10.806509 | orchestrator | 2026-04-20 00:54:10 | INFO  | Task d5ae424c-8b69-46b4-bdab-8948d1da77c1 is in state SUCCESS
2026-04-20 00:54:10.807591 | orchestrator |
2026-04-20 00:54:10.807628 | orchestrator |
2026-04-20 00:54:10.807633 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-20 00:54:10.807637 | orchestrator |
2026-04-20 00:54:10.807641 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-20 00:54:10.807645 | orchestrator | Monday 20 April 2026 00:53:49 +0000 (0:00:00.270) 0:00:00.270 **********
2026-04-20 00:54:10.807648 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:54:10.807651 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:54:10.807655 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:54:10.807658 | orchestrator |
2026-04-20 00:54:10.807661 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-20 00:54:10.807664 | orchestrator | Monday 20 April 2026 00:53:49 +0000 (0:00:00.275) 0:00:00.545 **********
2026-04-20 00:54:10.807667 | orchestrator | ok: [testbed-node-0] => (item=enable_opensearch_True)
2026-04-20 00:54:10.807671 | orchestrator | ok: [testbed-node-1] => (item=enable_opensearch_True)
2026-04-20 00:54:10.807677 | orchestrator | ok: [testbed-node-2] => (item=enable_opensearch_True)
2026-04-20 00:54:10.807681 | orchestrator |
2026-04-20 00:54:10.807687 | orchestrator | PLAY [Apply role opensearch] ***************************************************
2026-04-20 00:54:10.807692 | orchestrator |
2026-04-20 00:54:10.807697 | orchestrator | TASK [opensearch : include_tasks] **********************************************
2026-04-20 00:54:10.807702 | orchestrator | Monday 20 April 2026 00:53:49 +0000 (0:00:00.239) 0:00:00.784 **********
2026-04-20 00:54:10.807706 | orchestrator | included: /ansible/roles/opensearch/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:54:10.807711 | orchestrator |
2026-04-20 00:54:10.807716 | orchestrator | TASK [opensearch : Setting sysctl values] **************************************
2026-04-20 00:54:10.807720 | orchestrator | Monday 20 April 2026 00:53:50 +0000 (0:00:00.528) 0:00:01.313 **********
2026-04-20 00:54:10.807725 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.max_map_count', 'value': 262144})
2026-04-20 00:54:10.807730 | orchestrator | changed: [testbed-node-1] => (item={'name': 'vm.max_map_count', 'value': 262144})
2026-04-20 00:54:10.807742 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.max_map_count', 'value': 262144})
2026-04-20 00:54:10.807748 | orchestrator |
2026-04-20 00:54:10.807753 | orchestrator | TASK [opensearch : Ensuring config directories exist] **************************
2026-04-20 00:54:10.807760 | orchestrator | Monday 20 April 2026 00:53:51 +0000 (0:00:00.969) 0:00:02.282 **********
2026-04-20 00:54:10.807765 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'],
'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:54:10.807770 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:54:10.807794 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:54:10.807802 | orchestrator | changed: [testbed-node-1] => (item={'key': 
'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-20 00:54:10.807849 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 
'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-20 00:54:10.807855 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-20 00:54:10.807966 | orchestrator | 2026-04-20 00:54:10.807972 | orchestrator | TASK [opensearch : include_tasks] ********************************************** 2026-04-20 00:54:10.807976 | orchestrator | Monday 20 April 2026 00:53:52 +0000 (0:00:01.299) 0:00:03.582 ********** 2026-04-20 00:54:10.807983 | orchestrator | included: /ansible/roles/opensearch/tasks/copy-certs.yml for 
testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:54:10.807986 | orchestrator | 2026-04-20 00:54:10.807990 | orchestrator | TASK [service-cert-copy : opensearch | Copying over extra CA certificates] ***** 2026-04-20 00:54:10.807993 | orchestrator | Monday 20 April 2026 00:53:53 +0000 (0:00:00.479) 0:00:04.061 ********** 2026-04-20 00:54:10.807996 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:54:10.808003 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 
'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:54:10.808006 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:54:10.808013 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 
'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-20 00:54:10.808020 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-20 00:54:10.808026 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': 
['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-20 00:54:10.808030 | orchestrator | 2026-04-20 00:54:10.808033 | orchestrator | TASK [service-cert-copy : opensearch | Copying over backend internal TLS certificate] *** 2026-04-20 00:54:10.808037 | orchestrator | Monday 20 April 2026 00:53:55 +0000 (0:00:02.487) 0:00:06.549 ********** 2026-04-20 00:54:10.808040 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': 
['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:54:10.808049 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:54:10.808052 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': 
True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-04-20 00:54:10.808057 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:10.808061 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-04-20 00:54:10.808067 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:10.808070 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': 
['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:54:10.808077 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-04-20 00:54:10.808081 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:10.808084 | orchestrator | 2026-04-20 00:54:10.808087 | orchestrator | TASK [service-cert-copy : opensearch | 
Copying over backend internal TLS key] *** 2026-04-20 00:54:10.808090 | orchestrator | Monday 20 April 2026 00:53:56 +0000 (0:00:00.678) 0:00:07.227 ********** 2026-04-20 00:54:10.808095 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:54:10.808099 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 
'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-04-20 00:54:10.808104 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:10.808107 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:54:10.808113 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-04-20 00:54:10.808117 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:10.808121 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:54:10.808125 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': 
['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-04-20 00:54:10.808132 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:10.808136 | orchestrator | 2026-04-20 00:54:10.808139 | orchestrator | TASK [opensearch : Copying over config.json files for services] **************** 2026-04-20 00:54:10.808142 | orchestrator | Monday 20 April 2026 00:53:57 +0000 (0:00:00.867) 0:00:08.095 ********** 2026-04-20 00:54:10.808148 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 
'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:54:10.808154 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:54:10.808164 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 
00:54:10.808175 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-20 00:54:10.808186 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': 
True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-20 00:54:10.808195 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-20 00:54:10.808202 | orchestrator | 2026-04-20 00:54:10.808206 | orchestrator | TASK [opensearch : Copying over opensearch service config file] **************** 2026-04-20 00:54:10.808209 | orchestrator | Monday 20 April 2026 00:53:59 +0000 (0:00:02.249) 0:00:10.344 ********** 2026-04-20 00:54:10.808215 | 
orchestrator | changed: [testbed-node-0] 2026-04-20 00:54:10.808219 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:54:10.808222 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:54:10.808225 | orchestrator | 2026-04-20 00:54:10.808228 | orchestrator | TASK [opensearch : Copying over opensearch-dashboards config file] ************* 2026-04-20 00:54:10.808231 | orchestrator | Monday 20 April 2026 00:54:01 +0000 (0:00:02.240) 0:00:12.585 ********** 2026-04-20 00:54:10.808234 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:54:10.808238 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:54:10.808241 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:54:10.808244 | orchestrator | 2026-04-20 00:54:10.808247 | orchestrator | TASK [service-check-containers : opensearch | Check containers] **************** 2026-04-20 00:54:10.808250 | orchestrator | Monday 20 April 2026 00:54:03 +0000 (0:00:01.434) 0:00:14.020 ********** 2026-04-20 00:54:10.808253 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:54:10.808256 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 
'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:54:10.808263 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 00:54:10.808271 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': 
{'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-20 00:54:10.808279 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 
'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-20 00:54:10.808289 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}}) 2026-04-20 00:54:10.808294 | orchestrator | 2026-04-20 00:54:10.808300 | orchestrator | TASK [service-check-containers : opensearch | Notify handlers to restart containers] *** 2026-04-20 00:54:10.808305 | orchestrator | Monday 20 April 2026 00:54:05 +0000 (0:00:02.066) 0:00:16.086 ********** 2026-04-20 00:54:10.808310 | orchestrator | changed: [testbed-node-0] => { 2026-04-20 00:54:10.808316 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:54:10.808321 | orchestrator | } 2026-04-20 00:54:10.808328 | orchestrator | changed: 
[testbed-node-1] => { 2026-04-20 00:54:10.808331 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:54:10.808334 | orchestrator | } 2026-04-20 00:54:10.808337 | orchestrator | changed: [testbed-node-2] => { 2026-04-20 00:54:10.808340 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:54:10.808344 | orchestrator | } 2026-04-20 00:54:10.808347 | orchestrator | 2026-04-20 00:54:10.808353 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-20 00:54:10.808356 | orchestrator | Monday 20 April 2026 00:54:05 +0000 (0:00:00.507) 0:00:16.594 ********** 2026-04-20 00:54:10.808359 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:54:10.808375 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': 
['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-04-20 00:54:10.808378 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:10.808382 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:54:10.808389 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 
'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-04-20 00:54:10.808399 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:10.808480 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch:2.19.5.20260328', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 
'frontend_http_extra': ['option dontlog-normal'], 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 00:54:10.808490 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/2024.2/opensearch-dashboards:2.19.5.20260328', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password', 'backend_http_extra': ['option httpchk GET /api/status']}}}})  2026-04-20 00:54:10.808495 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:10.808501 | orchestrator | 2026-04-20 00:54:10.808506 | orchestrator | TASK [opensearch : include_tasks] ********************************************** 2026-04-20 00:54:10.808512 | orchestrator | Monday 20 April 2026 00:54:06 +0000 (0:00:00.744) 0:00:17.338 ********** 2026-04-20 00:54:10.808517 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:10.808522 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:10.808528 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:10.808533 | orchestrator | 2026-04-20 00:54:10.808538 | orchestrator | TASK 
[opensearch : Flush handlers] *********************************************
2026-04-20 00:54:10.808544 | orchestrator | Monday 20 April 2026 00:54:06 +0000 (0:00:00.057) 0:00:17.572 **********
2026-04-20 00:54:10.808549 | orchestrator |
2026-04-20 00:54:10.808555 | orchestrator | TASK [opensearch : Flush handlers] *********************************************
2026-04-20 00:54:10.808560 | orchestrator | Monday 20 April 2026 00:54:06 +0000 (0:00:00.056) 0:00:17.630 **********
2026-04-20 00:54:10.808565 | orchestrator |
2026-04-20 00:54:10.808570 | orchestrator | TASK [opensearch : Flush handlers] *********************************************
2026-04-20 00:54:10.808576 | orchestrator | Monday 20 April 2026 00:54:06 +0000 (0:00:00.058) 0:00:17.687 **********
2026-04-20 00:54:10.808581 | orchestrator |
2026-04-20 00:54:10.808587 | orchestrator | RUNNING HANDLER [opensearch : Disable shard allocation] ************************
2026-04-20 00:54:10.808603 | orchestrator | Monday 20 April 2026 00:54:06 +0000 (0:00:00.058) 0:00:17.746 **********
2026-04-20 00:54:10.808610 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:54:10.808615 | orchestrator |
2026-04-20 00:54:10.808621 | orchestrator | RUNNING HANDLER [opensearch : Perform a flush] *********************************
2026-04-20 00:54:10.808627 | orchestrator | Monday 20 April 2026 00:54:07 +0000 (0:00:00.420) 0:00:18.166 **********
2026-04-20 00:54:10.808632 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:54:10.808637 | orchestrator |
2026-04-20 00:54:10.808643 | orchestrator | RUNNING HANDLER [opensearch : Restart opensearch container] ********************
2026-04-20 00:54:10.808649 | orchestrator | Monday 20 April 2026 00:54:07 +0000 (0:00:00.160) 0:00:18.326 **********
2026-04-20 00:54:10.808659 | orchestrator | fatal: [testbed-node-0]: FAILED!
=> {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=2.19.5.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fopensearch\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_yvq7dd36/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_yvq7dd36/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_yvq7dd36/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_yvq7dd36/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in 
create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=2.19.5.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fopensearch: Internal Server Error (\"unknown: repository kolla/release/2024.2/opensearch not found\")\\n'"} 2026-04-20 00:54:10.808671 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=2.19.5.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fopensearch\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_0x43mlrp/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_0x43mlrp/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_0x43mlrp/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_0x43mlrp/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for 
line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=2.19.5.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fopensearch: Internal Server Error (\"unknown: repository kolla/release/2024.2/opensearch not found\")\\n'"} 2026-04-20 00:54:10.808685 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=2.19.5.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fopensearch\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_a0sho737/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_a0sho737/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n 
self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_a0sho737/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_a0sho737/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=2.19.5.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fopensearch: Internal Server Error (\"unknown: repository kolla/release/2024.2/opensearch not found\")\\n'"} 2026-04-20 00:54:10.808698 | orchestrator | 2026-04-20 00:54:10.808704 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:54:10.808714 | orchestrator | testbed-node-0 : ok=12  changed=8  unreachable=0 failed=1  skipped=6  rescued=0 ignored=0 2026-04-20 00:54:10.808720 | orchestrator | testbed-node-1 : ok=12  changed=8  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2026-04-20 00:54:10.808725 | orchestrator | testbed-node-2 : ok=12  changed=8  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2026-04-20 00:54:10.808731 | orchestrator | 2026-04-20 00:54:10.808736 | orchestrator | 2026-04-20 00:54:10.808742 | orchestrator | TASKS RECAP 
******************************************************************** 2026-04-20 00:54:10.808748 | orchestrator | Monday 20 April 2026 00:54:10 +0000 (0:00:03.065) 0:00:21.392 ********** 2026-04-20 00:54:10.808754 | orchestrator | =============================================================================== 2026-04-20 00:54:10.808760 | orchestrator | opensearch : Restart opensearch container ------------------------------- 3.07s 2026-04-20 00:54:10.808765 | orchestrator | service-cert-copy : opensearch | Copying over extra CA certificates ----- 2.49s 2026-04-20 00:54:10.808771 | orchestrator | opensearch : Copying over config.json files for services ---------------- 2.25s 2026-04-20 00:54:10.808777 | orchestrator | opensearch : Copying over opensearch service config file ---------------- 2.24s 2026-04-20 00:54:10.808782 | orchestrator | service-check-containers : opensearch | Check containers ---------------- 2.07s 2026-04-20 00:54:10.808787 | orchestrator | opensearch : Copying over opensearch-dashboards config file ------------- 1.43s 2026-04-20 00:54:10.808793 | orchestrator | opensearch : Ensuring config directories exist -------------------------- 1.30s 2026-04-20 00:54:10.808799 | orchestrator | opensearch : Setting sysctl values -------------------------------------- 0.97s 2026-04-20 00:54:10.808805 | orchestrator | service-cert-copy : opensearch | Copying over backend internal TLS key --- 0.87s 2026-04-20 00:54:10.808810 | orchestrator | service-check-containers : Include tasks -------------------------------- 0.74s 2026-04-20 00:54:10.808819 | orchestrator | service-cert-copy : opensearch | Copying over backend internal TLS certificate --- 0.68s 2026-04-20 00:54:10.808825 | orchestrator | opensearch : include_tasks ---------------------------------------------- 0.53s 2026-04-20 00:54:10.808831 | orchestrator | service-check-containers : opensearch | Notify handlers to restart containers --- 0.51s 2026-04-20 00:54:10.808837 | orchestrator | opensearch : 
include_tasks ---------------------------------------------- 0.48s 2026-04-20 00:54:10.808842 | orchestrator | opensearch : Disable shard allocation ----------------------------------- 0.42s 2026-04-20 00:54:10.808848 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.28s 2026-04-20 00:54:10.808854 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.24s 2026-04-20 00:54:10.808859 | orchestrator | opensearch : include_tasks ---------------------------------------------- 0.23s 2026-04-20 00:54:10.808865 | orchestrator | opensearch : Flush handlers --------------------------------------------- 0.17s 2026-04-20 00:54:10.808870 | orchestrator | opensearch : Perform a flush -------------------------------------------- 0.16s 2026-04-20 00:54:10.808876 | orchestrator | 2026-04-20 00:54:10 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:54:10.808881 | orchestrator | 2026-04-20 00:54:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:54:59.542749 | orchestrator | 2026-04-20 00:54:59 | INFO  | Task e7f09b97-4413-48e5-82eb-682ca07fb073 is in state SUCCESS 2026-04-20 00:54:59.543937 | orchestrator | 2026-04-20 00:54:59.543967 | orchestrator | 2026-04-20 00:54:59.543974 | orchestrator | PLAY [Set kolla_action_mariadb] ************************************************ 2026-04-20 00:54:59.543979 | orchestrator | 2026-04-20 00:54:59.543984 | orchestrator | TASK [Inform the user about the following task] ******************************** 2026-04-20 00:54:59.543990 | orchestrator | Monday 20 April 2026 00:53:49 +0000 (0:00:00.089) 0:00:00.089 ********** 2026-04-20 00:54:59.543994 | orchestrator | ok: [localhost] => { 2026-04-20 00:54:59.544000 | orchestrator |  "msg": "The task 'Check MariaDB service' fails if the MariaDB service has not yet been deployed. This is fine." 
2026-04-20 00:54:59.544005 | orchestrator | } 2026-04-20 00:54:59.544010 | orchestrator | 2026-04-20 00:54:59.544015 | orchestrator | TASK [Check MariaDB service] *************************************************** 2026-04-20 00:54:59.544020 | orchestrator | Monday 20 April 2026 00:53:49 +0000 (0:00:00.043) 0:00:00.133 ********** 2026-04-20 00:54:59.544025 | orchestrator | fatal: [localhost]: FAILED! => {"changed": false, "elapsed": 2, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.9:3306"} 2026-04-20 00:54:59.544031 | orchestrator | ...ignoring 2026-04-20 00:54:59.544036 | orchestrator | 2026-04-20 00:54:59.544042 | orchestrator | TASK [Set kolla_action_mariadb = upgrade if MariaDB is already running] ******** 2026-04-20 00:54:59.544047 | orchestrator | Monday 20 April 2026 00:53:52 +0000 (0:00:02.858) 0:00:02.992 ********** 2026-04-20 00:54:59.544052 | orchestrator | skipping: [localhost] 2026-04-20 00:54:59.544057 | orchestrator | 2026-04-20 00:54:59.544063 | orchestrator | TASK [Set kolla_action_mariadb = kolla_action_ng] ****************************** 2026-04-20 00:54:59.544068 | orchestrator | Monday 20 April 2026 00:53:52 +0000 (0:00:00.057) 0:00:03.049 ********** 2026-04-20 00:54:59.544074 | orchestrator | ok: [localhost] 2026-04-20 00:54:59.544080 | orchestrator | 2026-04-20 00:54:59.544086 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-04-20 00:54:59.544091 | orchestrator | 2026-04-20 00:54:59.544097 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-04-20 00:54:59.544119 | orchestrator | Monday 20 April 2026 00:53:52 +0000 (0:00:00.186) 0:00:03.235 ********** 2026-04-20 00:54:59.544125 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:54:59.544130 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:54:59.544135 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:54:59.544140 | orchestrator | 2026-04-20 00:54:59.544145 | 
orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-04-20 00:54:59.544150 | orchestrator | Monday 20 April 2026 00:53:52 +0000 (0:00:00.270) 0:00:03.505 ********** 2026-04-20 00:54:59.544156 | orchestrator | ok: [testbed-node-0] => (item=enable_mariadb_True) 2026-04-20 00:54:59.544275 | orchestrator | ok: [testbed-node-1] => (item=enable_mariadb_True) 2026-04-20 00:54:59.544284 | orchestrator | ok: [testbed-node-2] => (item=enable_mariadb_True) 2026-04-20 00:54:59.544290 | orchestrator | 2026-04-20 00:54:59.544295 | orchestrator | PLAY [Apply role mariadb] ****************************************************** 2026-04-20 00:54:59.544301 | orchestrator | 2026-04-20 00:54:59.544307 | orchestrator | TASK [mariadb : Group MariaDB hosts based on shards] *************************** 2026-04-20 00:54:59.544320 | orchestrator | Monday 20 April 2026 00:53:53 +0000 (0:00:00.344) 0:00:03.850 ********** 2026-04-20 00:54:59.544326 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0) 2026-04-20 00:54:59.544332 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-1) 2026-04-20 00:54:59.544337 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-2) 2026-04-20 00:54:59.544342 | orchestrator | 2026-04-20 00:54:59.544408 | orchestrator | TASK [mariadb : include_tasks] ************************************************* 2026-04-20 00:54:59.544563 | orchestrator | Monday 20 April 2026 00:53:53 +0000 (0:00:00.314) 0:00:04.164 ********** 2026-04-20 00:54:59.544575 | orchestrator | included: /ansible/roles/mariadb/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:54:59.544581 | orchestrator | 2026-04-20 00:54:59.544586 | orchestrator | TASK [mariadb : Ensuring config directories exist] ***************************** 2026-04-20 00:54:59.544592 | orchestrator | Monday 20 April 2026 00:53:53 +0000 (0:00:00.502) 0:00:04.667 ********** 2026-04-20 00:54:59.544614 | orchestrator | changed: 
[testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-04-20 00:54:59.544625 | orchestrator | changed: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': 
True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-04-20 00:54:59.544642 | orchestrator | changed: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': 
['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-04-20 00:54:59.544646 | orchestrator | 2026-04-20 00:54:59.544649 | orchestrator | TASK [mariadb : Ensuring database backup config directory exists] ************** 2026-04-20 00:54:59.544652 | orchestrator | Monday 20 April 2026 00:53:56 +0000 (0:00:02.441) 0:00:07.108 ********** 2026-04-20 00:54:59.544655 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.544659 | orchestrator | 
skipping: [testbed-node-2] 2026-04-20 00:54:59.544665 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:54:59.544668 | orchestrator | 2026-04-20 00:54:59.544671 | orchestrator | TASK [mariadb : Copying over my.cnf for mariabackup] *************************** 2026-04-20 00:54:59.544674 | orchestrator | Monday 20 April 2026 00:53:56 +0000 (0:00:00.504) 0:00:07.613 ********** 2026-04-20 00:54:59.544677 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.544680 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.544683 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:54:59.544686 | orchestrator | 2026-04-20 00:54:59.544689 | orchestrator | TASK [mariadb : Copying over config.json files for services] ******************* 2026-04-20 00:54:59.544693 | orchestrator | Monday 20 April 2026 00:53:58 +0000 (0:00:01.238) 0:00:08.851 ********** 2026-04-20 00:54:59.544698 | orchestrator | changed: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 
inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-04-20 00:54:59.544705 | orchestrator | changed: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' 
server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-04-20 00:54:59.544713 | orchestrator | changed: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': 
{'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-04-20 00:54:59.544717 | orchestrator | 2026-04-20 00:54:59.544720 | orchestrator | TASK [mariadb : Copying over config.json files for mariabackup] **************** 2026-04-20 00:54:59.544723 | orchestrator | Monday 20 April 2026 00:54:01 +0000 (0:00:03.165) 0:00:12.017 ********** 2026-04-20 00:54:59.544727 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.544730 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.544733 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:54:59.544736 | orchestrator | 2026-04-20 00:54:59.544739 | orchestrator | TASK [mariadb : Copying over galera.cnf] *************************************** 2026-04-20 00:54:59.544742 | orchestrator | Monday 20 April 2026 00:54:02 +0000 (0:00:00.969) 0:00:12.986 ********** 2026-04-20 00:54:59.544745 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:54:59.544748 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:54:59.544751 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:54:59.544754 | orchestrator | 2026-04-20 00:54:59.544757 | orchestrator | TASK [mariadb : include_tasks] ************************************************* 2026-04-20 00:54:59.544760 | orchestrator | Monday 20 April 2026 00:54:05 +0000 (0:00:03.579) 0:00:16.566 ********** 2026-04-20 00:54:59.544763 | orchestrator | included: /ansible/roles/mariadb/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:54:59.544766 | orchestrator | 2026-04-20 
00:54:59.544769 | orchestrator | TASK [service-cert-copy : mariadb | Copying over extra CA certificates] ******** 2026-04-20 00:54:59.544772 | orchestrator | Monday 20 April 2026 00:54:06 +0000 (0:00:00.506) 0:00:17.072 ********** 2026-04-20 00:54:59.544780 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 
backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-20 00:54:59.544786 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.544796 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 
check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-20 00:54:59.544799 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.544806 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', 
'']}}}})  2026-04-20 00:54:59.544811 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.544814 | orchestrator | 2026-04-20 00:54:59.544818 | orchestrator | TASK [service-cert-copy : mariadb | Copying over backend internal TLS certificate] *** 2026-04-20 00:54:59.544821 | orchestrator | Monday 20 April 2026 00:54:08 +0000 (0:00:02.468) 0:00:19.541 ********** 2026-04-20 00:54:59.544826 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 
testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-20 00:54:59.544830 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.544836 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 
2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-20 00:54:59.544841 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.544846 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 
testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-20 00:54:59.544850 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.544853 | orchestrator | 2026-04-20 00:54:59.544856 | orchestrator | TASK [service-cert-copy : mariadb | Copying over backend internal TLS key] ***** 2026-04-20 00:54:59.544859 | orchestrator | Monday 20 April 2026 00:54:11 +0000 (0:00:02.266) 0:00:21.808 ********** 2026-04-20 00:54:59.544864 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option 
clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-20 00:54:59.544870 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.544875 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 
'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-20 00:54:59.544878 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.544882 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout 
server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-20 00:54:59.544889 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.544892 | orchestrator | 2026-04-20 00:54:59.544897 | orchestrator | TASK [service-check-containers : mariadb | Check containers] ******************* 2026-04-20 00:54:59.544900 | orchestrator | Monday 20 April 2026 00:54:13 +0000 (0:00:02.211) 0:00:24.019 ********** 2026-04-20 00:54:59.544903 | orchestrator | changed: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 
fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-04-20 00:54:59.544909 | orchestrator | changed: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 
'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-04-20 00:54:59.544918 | orchestrator | changed: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 
3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2026-04-20 00:54:59.544921 | orchestrator | 2026-04-20 00:54:59.544924 | orchestrator | TASK [service-check-containers : mariadb | Notify handlers to restart containers] *** 2026-04-20 00:54:59.544928 | orchestrator | Monday 20 April 2026 00:54:15 +0000 (0:00:02.536) 0:00:26.555 ********** 2026-04-20 00:54:59.544931 | orchestrator | changed: [testbed-node-0] => { 2026-04-20 00:54:59.544934 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:54:59.544937 | orchestrator | } 2026-04-20 00:54:59.544940 | orchestrator | changed: [testbed-node-1] => { 2026-04-20 00:54:59.544943 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:54:59.544946 | orchestrator | } 2026-04-20 00:54:59.544951 | orchestrator | changed: [testbed-node-2] => { 2026-04-20 00:54:59.544955 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:54:59.544958 | orchestrator | } 2026-04-20 00:54:59.544961 | orchestrator | 2026-04-20 00:54:59.544986 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-20 00:54:59.544990 | orchestrator | Monday 20 April 2026 00:54:16 +0000 (0:00:00.333) 0:00:26.888 ********** 2026-04-20 00:54:59.544996 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-20 00:54:59.545000 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.545006 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': 
'5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-20 00:54:59.545010 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.545015 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 
'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2026-04-20 00:54:59.545021 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.545024 | orchestrator | 2026-04-20 00:54:59.545027 | orchestrator | TASK [mariadb : Checking for mariadb cluster] ********************************** 2026-04-20 00:54:59.545030 | orchestrator | Monday 20 April 2026 00:54:18 +0000 (0:00:02.064) 0:00:28.953 ********** 2026-04-20 00:54:59.545033 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.545036 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.545039 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.545042 | orchestrator | 2026-04-20 00:54:59.545045 | orchestrator | TASK [mariadb : Cleaning up temp file on localhost] **************************** 2026-04-20 
00:54:59.545050 | orchestrator | Monday 20 April 2026 00:54:18 +0000 (0:00:00.468) 0:00:29.421 ********** 2026-04-20 00:54:59.545053 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.545056 | orchestrator | 2026-04-20 00:54:59.545059 | orchestrator | TASK [mariadb : Stop MariaDB containers] *************************************** 2026-04-20 00:54:59.545063 | orchestrator | Monday 20 April 2026 00:54:18 +0000 (0:00:00.104) 0:00:29.525 ********** 2026-04-20 00:54:59.545066 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.545069 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.545072 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.545075 | orchestrator | 2026-04-20 00:54:59.545078 | orchestrator | TASK [mariadb : Run MariaDB wsrep recovery] ************************************ 2026-04-20 00:54:59.545081 | orchestrator | Monday 20 April 2026 00:54:19 +0000 (0:00:00.382) 0:00:29.908 ********** 2026-04-20 00:54:59.545084 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.545087 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.545090 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.545093 | orchestrator | 2026-04-20 00:54:59.545096 | orchestrator | TASK [mariadb : Copying MariaDB log file to /tmp] ****************************** 2026-04-20 00:54:59.545100 | orchestrator | Monday 20 April 2026 00:54:19 +0000 (0:00:00.313) 0:00:30.221 ********** 2026-04-20 00:54:59.545103 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.545106 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.545109 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.545112 | orchestrator | 2026-04-20 00:54:59.545115 | orchestrator | TASK [mariadb : Get MariaDB wsrep recovery seqno] ****************************** 2026-04-20 00:54:59.545118 | orchestrator | Monday 20 April 2026 00:54:19 +0000 (0:00:00.286) 0:00:30.508 ********** 2026-04-20 00:54:59.545123 | 
orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.545126 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.545129 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.545132 | orchestrator | 2026-04-20 00:54:59.545136 | orchestrator | TASK [mariadb : Removing MariaDB log file from /tmp] *************************** 2026-04-20 00:54:59.545139 | orchestrator | Monday 20 April 2026 00:54:20 +0000 (0:00:00.408) 0:00:30.917 ********** 2026-04-20 00:54:59.545142 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.545145 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.545148 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.545151 | orchestrator | 2026-04-20 00:54:59.545154 | orchestrator | TASK [mariadb : Registering MariaDB seqno variable] **************************** 2026-04-20 00:54:59.545157 | orchestrator | Monday 20 April 2026 00:54:20 +0000 (0:00:00.274) 0:00:31.191 ********** 2026-04-20 00:54:59.545160 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.545163 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.545166 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.545169 | orchestrator | 2026-04-20 00:54:59.545173 | orchestrator | TASK [mariadb : Comparing seqno value on all mariadb hosts] ******************** 2026-04-20 00:54:59.545176 | orchestrator | Monday 20 April 2026 00:54:20 +0000 (0:00:00.266) 0:00:31.458 ********** 2026-04-20 00:54:59.545179 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2026-04-20 00:54:59.545182 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2026-04-20 00:54:59.545188 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2026-04-20 00:54:59.545202 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.545206 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-0)  2026-04-20 00:54:59.545209 | orchestrator | skipping: [testbed-node-1] 
=> (item=testbed-node-1)  2026-04-20 00:54:59.545213 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-2)  2026-04-20 00:54:59.545217 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.545220 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-0)  2026-04-20 00:54:59.545224 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-1)  2026-04-20 00:54:59.545227 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-2)  2026-04-20 00:54:59.545231 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.545235 | orchestrator | 2026-04-20 00:54:59.545238 | orchestrator | TASK [mariadb : Writing hostname of host with the largest seqno to temp file] *** 2026-04-20 00:54:59.545242 | orchestrator | Monday 20 April 2026 00:54:21 +0000 (0:00:00.308) 0:00:31.766 ********** 2026-04-20 00:54:59.545245 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.545249 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.545252 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.545256 | orchestrator | 2026-04-20 00:54:59.545260 | orchestrator | TASK [mariadb : Registering mariadb_recover_inventory_name from temp file] ***** 2026-04-20 00:54:59.545263 | orchestrator | Monday 20 April 2026 00:54:21 +0000 (0:00:00.427) 0:00:32.193 ********** 2026-04-20 00:54:59.545267 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.545270 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.545274 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.545277 | orchestrator | 2026-04-20 00:54:59.545281 | orchestrator | TASK [mariadb : Store bootstrap and master hostnames into facts] *************** 2026-04-20 00:54:59.545284 | orchestrator | Monday 20 April 2026 00:54:21 +0000 (0:00:00.300) 0:00:32.494 ********** 2026-04-20 00:54:59.545288 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.545291 | orchestrator | skipping: [testbed-node-1] 2026-04-20 
00:54:59.545295 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.545299 | orchestrator | 2026-04-20 00:54:59.545302 | orchestrator | TASK [mariadb : Set grastate.dat file from MariaDB container in bootstrap host] *** 2026-04-20 00:54:59.545306 | orchestrator | Monday 20 April 2026 00:54:22 +0000 (0:00:00.309) 0:00:32.804 ********** 2026-04-20 00:54:59.545309 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.545313 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.545318 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.545322 | orchestrator | 2026-04-20 00:54:59.545326 | orchestrator | TASK [mariadb : Starting first MariaDB container] ****************************** 2026-04-20 00:54:59.545329 | orchestrator | Monday 20 April 2026 00:54:22 +0000 (0:00:00.289) 0:00:33.093 ********** 2026-04-20 00:54:59.545333 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.545336 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.545340 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.545343 | orchestrator | 2026-04-20 00:54:59.545347 | orchestrator | TASK [mariadb : Wait for first MariaDB container] ****************************** 2026-04-20 00:54:59.545392 | orchestrator | Monday 20 April 2026 00:54:22 +0000 (0:00:00.379) 0:00:33.473 ********** 2026-04-20 00:54:59.545397 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.545401 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.545404 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.545407 | orchestrator | 2026-04-20 00:54:59.545411 | orchestrator | TASK [mariadb : Set first MariaDB container as primary] ************************ 2026-04-20 00:54:59.545414 | orchestrator | Monday 20 April 2026 00:54:22 +0000 (0:00:00.257) 0:00:33.730 ********** 2026-04-20 00:54:59.545418 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.545422 | orchestrator | skipping: [testbed-node-1] 2026-04-20 
00:54:59.545425 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.545428 | orchestrator | 2026-04-20 00:54:59.545432 | orchestrator | TASK [mariadb : Wait for MariaDB to become operational] ************************ 2026-04-20 00:54:59.545436 | orchestrator | Monday 20 April 2026 00:54:23 +0000 (0:00:00.273) 0:00:34.004 ********** 2026-04-20 00:54:59.545439 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:54:59.545443 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:54:59.545446 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:54:59.545450 | orchestrator | 2026-04-20 00:54:59.545453 | orchestrator | TASK [mariadb : Restart slave MariaDB container(s)] **************************** 2026-04-20 00:54:59.545457 | orchestrator | Monday 20 April 2026 00:54:23 +0000 (0:00:00.286) 0:00:34.291 ********** 2026-04-20 00:54:59.545463 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' 
server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})
2026-04-20 00:54:59.545467 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:54:59.545476 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})
2026-04-20 00:54:59.545480 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:54:59.545486 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})
2026-04-20 00:54:59.545490 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:54:59.545494 | orchestrator |
2026-04-20 00:54:59.545497 | orchestrator | TASK [mariadb : Wait for slave MariaDB] ****************************************
2026-04-20 00:54:59.545501 | orchestrator | Monday 20 April 2026 00:54:25 +0000 (0:00:01.783) 0:00:36.075 **********
2026-04-20 00:54:59.545507 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:54:59.545510 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:54:59.545514 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:54:59.545518 | orchestrator |
2026-04-20 00:54:59.545521 | orchestrator | TASK [mariadb : Restart master MariaDB container(s)] ***************************
2026-04-20 00:54:59.545525 | orchestrator | Monday 20 April 2026 00:54:25 +0000 (0:00:00.367) 0:00:36.442 **********
2026-04-20 00:54:59.545532 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})
2026-04-20 00:54:59.545536 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:54:59.545542 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})
2026-04-20 00:54:59.545549 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:54:59.545556 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/mariadb-server:10.11.16.20260328', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})
2026-04-20 00:54:59.545560 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:54:59.545563 | orchestrator |
2026-04-20 00:54:59.545567 | orchestrator | TASK [mariadb : Wait for master mariadb] ***************************************
2026-04-20 00:54:59.545570 | orchestrator | Monday 20 April 2026 00:54:27 +0000 (0:00:01.884) 0:00:38.327 **********
2026-04-20 00:54:59.545600 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:54:59.545608 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:54:59.545613 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:54:59.545617 | orchestrator |
2026-04-20 00:54:59.545621 | orchestrator | TASK [service-check : mariadb | Get container facts]
***************************
2026-04-20 00:54:59.545626 | orchestrator | Monday 20 April 2026 00:54:27 +0000 (0:00:00.261) 0:00:38.589 **********
2026-04-20 00:54:59.545630 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:54:59.545634 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:54:59.545639 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:54:59.545644 | orchestrator |
2026-04-20 00:54:59.545648 | orchestrator | TASK [service-check : mariadb | Fail if containers are missing or not running] ***
2026-04-20 00:54:59.545653 | orchestrator | Monday 20 April 2026 00:54:28 +0000 (0:00:00.269) 0:00:38.858 **********
2026-04-20 00:54:59.545658 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:54:59.545662 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:54:59.545666 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:54:59.545671 | orchestrator |
2026-04-20 00:54:59.545675 | orchestrator | TASK [service-check : mariadb | Fail if containers are unhealthy] **************
2026-04-20 00:54:59.545680 | orchestrator | Monday 20 April 2026 00:54:28 +0000 (0:00:00.399) 0:00:39.258 **********
2026-04-20 00:54:59.545684 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:54:59.545689 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:54:59.545694 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:54:59.545698 | orchestrator |
2026-04-20 00:54:59.545703 | orchestrator | TASK [mariadb : Wait for MariaDB service to be ready through VIP] **************
2026-04-20 00:54:59.545712 | orchestrator | Monday 20 April 2026 00:54:28 +0000 (0:00:00.445) 0:00:39.703 **********
2026-04-20 00:54:59.545716 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:54:59.545720 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:54:59.545725 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:54:59.545729 | orchestrator |
2026-04-20 00:54:59.545734 | orchestrator | TASK [mariadb : Create MariaDB volume] *****************************************
2026-04-20 00:54:59.545739 | orchestrator | Monday 20 April 2026 00:54:29 +0000 (0:00:00.256) 0:00:39.960 **********
2026-04-20 00:54:59.545744 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:54:59.545749 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:54:59.545753 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:54:59.545758 | orchestrator |
2026-04-20 00:54:59.545765 | orchestrator | TASK [mariadb : Divide hosts by their MariaDB volume availability] *************
2026-04-20 00:54:59.545770 | orchestrator | Monday 20 April 2026 00:54:30 +0000 (0:00:00.923) 0:00:40.883 **********
2026-04-20 00:54:59.545775 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:54:59.545780 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:54:59.545784 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:54:59.545789 | orchestrator |
2026-04-20 00:54:59.545794 | orchestrator | TASK [mariadb : Establish whether the cluster has already existed] *************
2026-04-20 00:54:59.545799 | orchestrator | Monday 20 April 2026 00:54:30 +0000 (0:00:00.262) 0:00:41.146 **********
2026-04-20 00:54:59.545805 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:54:59.545809 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:54:59.545812 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:54:59.545815 | orchestrator |
2026-04-20 00:54:59.545818 | orchestrator | TASK [mariadb : Check MariaDB service port liveness] ***************************
2026-04-20 00:54:59.545822 | orchestrator | Monday 20 April 2026 00:54:30 +0000 (0:00:00.271) 0:00:41.418 **********
2026-04-20 00:54:59.545826 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": false, "elapsed": 10, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.10:3306"}
2026-04-20 00:54:59.545830 | orchestrator | ...ignoring
2026-04-20 00:54:59.545833 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": false, "elapsed": 10, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.11:3306"}
2026-04-20 00:54:59.545836 | orchestrator | ...ignoring
2026-04-20 00:54:59.545839 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"changed": false, "elapsed": 10, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.12:3306"}
2026-04-20 00:54:59.545843 | orchestrator | ...ignoring
2026-04-20 00:54:59.545846 | orchestrator |
2026-04-20 00:54:59.545849 | orchestrator | TASK [mariadb : Divide hosts by their MariaDB service port liveness] ***********
2026-04-20 00:54:59.545852 | orchestrator | Monday 20 April 2026 00:54:41 +0000 (0:00:10.694) 0:00:52.113 **********
2026-04-20 00:54:59.545855 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:54:59.545858 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:54:59.545861 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:54:59.545864 | orchestrator |
2026-04-20 00:54:59.545881 | orchestrator | TASK [mariadb : Fail on existing but stopped cluster] **************************
2026-04-20 00:54:59.545885 | orchestrator | Monday 20 April 2026 00:54:41 +0000 (0:00:00.394) 0:00:52.507 **********
2026-04-20 00:54:59.545888 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:54:59.545891 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:54:59.545894 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:54:59.545897 | orchestrator |
2026-04-20 00:54:59.545900 | orchestrator | TASK [mariadb : Check MariaDB service WSREP sync status] ***********************
2026-04-20 00:54:59.545903 | orchestrator | Monday 20 April 2026 00:54:42 +0000 (0:00:00.264) 0:00:52.772 **********
2026-04-20 00:54:59.545906 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:54:59.545909 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:54:59.545913 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:54:59.545916 | orchestrator |
2026-04-20 00:54:59.545922 | orchestrator | TASK [mariadb : Extract MariaDB service WSREP sync status] *********************
2026-04-20 00:54:59.545928 | orchestrator | Monday 20 April 2026 00:54:42 +0000 (0:00:00.361) 0:00:53.133 **********
2026-04-20 00:54:59.545931 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:54:59.545934 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:54:59.545937 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:54:59.545940 | orchestrator |
2026-04-20 00:54:59.545943 | orchestrator | TASK [mariadb : Divide hosts by their MariaDB service WSREP sync status] *******
2026-04-20 00:54:59.545946 | orchestrator | Monday 20 April 2026 00:54:42 +0000 (0:00:00.329) 0:00:53.463 **********
2026-04-20 00:54:59.545950 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:54:59.545953 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:54:59.545956 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:54:59.545959 | orchestrator |
2026-04-20 00:54:59.545962 | orchestrator | TASK [mariadb : Fail when MariaDB services are not synced across the whole cluster] ***
2026-04-20 00:54:59.545965 | orchestrator | Monday 20 April 2026 00:54:43 +0000 (0:00:00.302) 0:00:53.765 **********
2026-04-20 00:54:59.545968 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:54:59.545971 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:54:59.545974 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:54:59.545977 | orchestrator |
2026-04-20 00:54:59.545980 | orchestrator | TASK [mariadb : include_tasks] *************************************************
2026-04-20 00:54:59.545983 | orchestrator | Monday 20 April 2026 00:54:43 +0000 (0:00:00.496) 0:00:54.262 **********
2026-04-20 00:54:59.545986 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:54:59.545989 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:54:59.545992 | orchestrator | included: /ansible/roles/mariadb/tasks/bootstrap_cluster.yml for testbed-node-0
2026-04-20 00:54:59.545995 | orchestrator |
2026-04-20
00:54:59.545998 | orchestrator | TASK [mariadb : Running MariaDB bootstrap container] ***************************
2026-04-20 00:54:59.546001 | orchestrator | Monday 20 April 2026 00:54:43 +0000 (0:00:00.344) 0:00:54.607 **********
2026-04-20 00:54:59.546008 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=10.11.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fmariadb-server\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_crye5lao/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_crye5lao/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_crye5lao/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=10.11.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fmariadb-server: Internal Server Error (\"unknown: repository kolla/release/2024.2/mariadb-server not found\")\\n'"}
2026-04-20 00:54:59.546054 | orchestrator |
2026-04-20 00:54:59.546059 | orchestrator | TASK [mariadb : include_tasks] *************************************************
2026-04-20 00:54:59.546062 | orchestrator | Monday 20 April 2026 00:54:47 +0000 (0:00:04.057) 0:00:58.664 **********
2026-04-20 00:54:59.546065 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:54:59.546071 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:54:59.546076 | orchestrator |
2026-04-20 00:54:59.546087 | orchestrator | RUNNING HANDLER [mariadb : Restart MariaDB on existing cluster members] ********
2026-04-20 00:54:59.546093 | orchestrator | Monday 20 April 2026 00:54:48 +0000 (0:00:00.610) 0:00:59.275 **********
2026-04-20 00:54:59.546097 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:54:59.546102 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:54:59.546107 | orchestrator |
2026-04-20 00:54:59.546116 | orchestrator | RUNNING HANDLER [mariadb : Start MariaDB on new nodes] *************************
2026-04-20 00:54:59.546121 | orchestrator | Monday 20 April 2026 00:54:48 +0000 (0:00:00.211) 0:00:59.487 **********
2026-04-20 00:54:59.546126 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:54:59.546130 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:54:59.546136 | orchestrator | [WARNING]: Could not match supplied host pattern, ignoring: mariadb_restart
2026-04-20 00:54:59.546141 | orchestrator |
2026-04-20 00:54:59.546146 | orchestrator | PLAY [Restart mariadb services] ************************************************
2026-04-20 00:54:59.546152 | orchestrator | skipping: no hosts matched
2026-04-20 00:54:59.546157 | orchestrator |
2026-04-20 00:54:59.546162 | orchestrator | PLAY [Start mariadb services] **************************************************
2026-04-20 00:54:59.546167 | orchestrator |
2026-04-20 00:54:59.546173 | orchestrator | TASK [mariadb : Restart MariaDB container] *************************************
2026-04-20 00:54:59.546177 | orchestrator | Monday 20 April 2026 00:54:48 +0000 (0:00:00.240) 0:00:59.727 **********
2026-04-20 00:54:59.546186 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=10.11.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fmariadb-server\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_nya4x1d3/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_nya4x1d3/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_nya4x1d3/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_nya4x1d3/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=10.11.16.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fmariadb-server: Internal Server Error (\"unknown: repository kolla/release/2024.2/mariadb-server not found\")\\n'"}
2026-04-20 00:54:59.546196 | orchestrator |
2026-04-20 00:54:59.546201 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:54:59.546207 | orchestrator | localhost : ok=3  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=1
2026-04-20 00:54:59.546213 | orchestrator | testbed-node-0 : ok=20  changed=9  unreachable=0 failed=1  skipped=33  rescued=0 ignored=1
2026-04-20 00:54:59.546219 | orchestrator | testbed-node-1 : ok=16  changed=7  unreachable=0 failed=1  skipped=38  rescued=0 ignored=1
2026-04-20 00:54:59.546224 | orchestrator | testbed-node-2 : ok=16  changed=7  unreachable=0 failed=0 skipped=38  rescued=0 ignored=1
2026-04-20 00:54:59.546230 | orchestrator |
2026-04-20 00:54:59.546235 | orchestrator |
2026-04-20 00:54:59.546240 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:54:59.546245 | orchestrator | Monday 20 April
2026 00:54:57 +0000 (0:00:08.557) 0:01:08.284 **********
2026-04-20 00:54:59.546253 | orchestrator | ===============================================================================
2026-04-20 00:54:59.546259 | orchestrator | mariadb : Check MariaDB service port liveness -------------------------- 10.69s
2026-04-20 00:54:59.546311 | orchestrator | mariadb : Restart MariaDB container ------------------------------------- 8.56s
2026-04-20 00:54:59.546317 | orchestrator | mariadb : Running MariaDB bootstrap container --------------------------- 4.06s
2026-04-20 00:54:59.546322 | orchestrator | mariadb : Copying over galera.cnf --------------------------------------- 3.58s
2026-04-20 00:54:59.546328 | orchestrator | mariadb : Copying over config.json files for services ------------------- 3.17s
2026-04-20 00:54:59.546331 | orchestrator | Check MariaDB service --------------------------------------------------- 2.86s
2026-04-20 00:54:59.546334 | orchestrator | service-check-containers : mariadb | Check containers ------------------- 2.54s
2026-04-20 00:54:59.546338 | orchestrator | service-cert-copy : mariadb | Copying over extra CA certificates -------- 2.47s
2026-04-20 00:54:59.546341 | orchestrator | mariadb : Ensuring config directories exist ----------------------------- 2.44s
2026-04-20 00:54:59.546344 | orchestrator | service-cert-copy : mariadb | Copying over backend internal TLS certificate --- 2.27s
2026-04-20 00:54:59.546347 | orchestrator | service-cert-copy : mariadb | Copying over backend internal TLS key ----- 2.21s
2026-04-20 00:54:59.546350 | orchestrator | service-check-containers : Include tasks -------------------------------- 2.07s
2026-04-20 00:54:59.546372 | orchestrator | mariadb : Restart master MariaDB container(s) --------------------------- 1.88s
2026-04-20 00:54:59.546377 | orchestrator | mariadb : Restart slave MariaDB container(s) ---------------------------- 1.78s
2026-04-20 00:54:59.546380 | orchestrator | mariadb : Copying over my.cnf for mariabackup --------------------------- 1.24s
2026-04-20 00:54:59.546383 | orchestrator | mariadb : Copying over config.json files for mariabackup ---------------- 0.97s
2026-04-20 00:54:59.546386 | orchestrator | mariadb : Create MariaDB volume ----------------------------------------- 0.92s
2026-04-20 00:54:59.546389 | orchestrator | mariadb : include_tasks ------------------------------------------------- 0.61s
2026-04-20 00:54:59.546392 | orchestrator | mariadb : include_tasks ------------------------------------------------- 0.51s
2026-04-20 00:54:59.546395 | orchestrator | mariadb : Ensuring database backup config directory exists -------------- 0.50s
2026-04-20 00:54:59.546407 | orchestrator | 2026-04-20 00:54:59 | INFO  | Task b3861f08-0766-4fae-94e3-8c11a1339d47 is in state STARTED
2026-04-20 00:54:59.548180 | orchestrator | 2026-04-20 00:54:59 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:54:59.549855 | orchestrator | 2026-04-20 00:54:59 | INFO  | Task 373cb906-bfdf-4b1b-930c-055623da957a is in state STARTED
2026-04-20 00:54:59.550183 | orchestrator | 2026-04-20 00:54:59 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:55:02.591469 | orchestrator | 2026-04-20 00:55:02 | INFO  | Task b3861f08-0766-4fae-94e3-8c11a1339d47 is in state STARTED
2026-04-20 00:55:02.592283 | orchestrator | 2026-04-20 00:55:02 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:55:02.593216 | orchestrator | 2026-04-20 00:55:02 | INFO  | Task 373cb906-bfdf-4b1b-930c-055623da957a is in state STARTED
2026-04-20 00:55:02.593252 | orchestrator | 2026-04-20 00:55:02 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:55:05.633933 | orchestrator | 2026-04-20 00:55:05 | INFO  | Task b3861f08-0766-4fae-94e3-8c11a1339d47 is in state STARTED
2026-04-20 00:55:05.635166 | orchestrator | 2026-04-20 00:55:05 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:55:05.636459 | orchestrator | 2026-04-20 00:55:05 | INFO  | Task 373cb906-bfdf-4b1b-930c-055623da957a is in state STARTED
2026-04-20 00:55:05.636496 | orchestrator | 2026-04-20 00:55:05 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:55:08.680840 | orchestrator | 2026-04-20 00:55:08 | INFO  | Task b3861f08-0766-4fae-94e3-8c11a1339d47 is in state STARTED
2026-04-20 00:55:08.681690 | orchestrator | 2026-04-20 00:55:08 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:55:08.682618 | orchestrator | 2026-04-20 00:55:08 | INFO  | Task 373cb906-bfdf-4b1b-930c-055623da957a is in state STARTED
2026-04-20 00:55:08.682660 | orchestrator | 2026-04-20 00:55:08 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:55:11.757514 | orchestrator | 2026-04-20 00:55:11 | INFO  | Task b3861f08-0766-4fae-94e3-8c11a1339d47 is in state STARTED
2026-04-20 00:55:11.758197 | orchestrator | 2026-04-20 00:55:11 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:55:11.759399 | orchestrator | 2026-04-20 00:55:11 | INFO  | Task 373cb906-bfdf-4b1b-930c-055623da957a is in state STARTED
2026-04-20 00:55:11.759438 | orchestrator | 2026-04-20 00:55:11 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:55:14.791873 | orchestrator | 2026-04-20 00:55:14 | INFO  | Task b3861f08-0766-4fae-94e3-8c11a1339d47 is in state STARTED
2026-04-20 00:55:14.791924 | orchestrator | 2026-04-20 00:55:14 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:55:14.792909 | orchestrator | 2026-04-20 00:55:14 | INFO  | Task 373cb906-bfdf-4b1b-930c-055623da957a is in state STARTED
2026-04-20 00:55:14.793011 | orchestrator | 2026-04-20 00:55:14 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:55:17.827243 | orchestrator | 2026-04-20 00:55:17 | INFO  | Task b3861f08-0766-4fae-94e3-8c11a1339d47 is in state STARTED
2026-04-20 00:55:17.827368 | orchestrator | 2026-04-20 00:55:17 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:55:17.828113 | orchestrator | 2026-04-20 00:55:17 | INFO  | Task 373cb906-bfdf-4b1b-930c-055623da957a is in state STARTED
2026-04-20 00:55:17.828153 | orchestrator | 2026-04-20 00:55:17 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:55:20.885847 | orchestrator | 2026-04-20 00:55:20 | INFO  | Task b3861f08-0766-4fae-94e3-8c11a1339d47 is in state STARTED
2026-04-20 00:55:20.886490 | orchestrator | 2026-04-20 00:55:20 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:55:20.888514 | orchestrator | 2026-04-20 00:55:20 | INFO  | Task 373cb906-bfdf-4b1b-930c-055623da957a is in state STARTED
2026-04-20 00:55:20.889025 | orchestrator | 2026-04-20 00:55:20 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:55:23.929148 | orchestrator | 2026-04-20 00:55:23 | INFO  | Task b3861f08-0766-4fae-94e3-8c11a1339d47 is in state STARTED
2026-04-20 00:55:23.930840 | orchestrator | 2026-04-20 00:55:23 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:55:23.932704 | orchestrator | 2026-04-20 00:55:23 | INFO  | Task 373cb906-bfdf-4b1b-930c-055623da957a is in state STARTED
2026-04-20 00:55:23.932746 | orchestrator | 2026-04-20 00:55:23 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:55:26.963741 | orchestrator | 2026-04-20 00:55:26 | INFO  | Task b3861f08-0766-4fae-94e3-8c11a1339d47 is in state STARTED
2026-04-20 00:55:26.964678 | orchestrator | 2026-04-20 00:55:26 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:55:26.968477 | orchestrator | 2026-04-20 00:55:26 | INFO  | Task 373cb906-bfdf-4b1b-930c-055623da957a is in state STARTED
2026-04-20 00:55:26.968532 | orchestrator | 2026-04-20 00:55:26 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:55:30.022955 | orchestrator | 2026-04-20 00:55:30 | INFO  | Task b3861f08-0766-4fae-94e3-8c11a1339d47 is in state SUCCESS
2026-04-20 00:55:30.023737 | orchestrator |
2026-04-20 00:55:30.023790 | orchestrator |
2026-04-20 00:55:30.023800 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-20 00:55:30.023808 | orchestrator |
2026-04-20 00:55:30.023815 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-20 00:55:30.023822 | orchestrator | Monday 20 April 2026 00:55:00 +0000 (0:00:00.311) 0:00:00.311 **********
2026-04-20 00:55:30.023829 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:55:30.023837 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:55:30.023844 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:55:30.023851 | orchestrator |
2026-04-20 00:55:30.023858 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-20 00:55:30.023865 | orchestrator | Monday 20 April 2026 00:55:01 +0000 (0:00:00.277) 0:00:00.588 **********
2026-04-20 00:55:30.023872 | orchestrator | ok: [testbed-node-0] => (item=enable_horizon_True)
2026-04-20 00:55:30.023879 | orchestrator | ok: [testbed-node-1] => (item=enable_horizon_True)
2026-04-20 00:55:30.023886 | orchestrator | ok: [testbed-node-2] => (item=enable_horizon_True)
2026-04-20 00:55:30.023892 | orchestrator |
2026-04-20 00:55:30.023899 | orchestrator | PLAY [Apply role horizon] ******************************************************
2026-04-20 00:55:30.023906 | orchestrator |
2026-04-20 00:55:30.023913 | orchestrator | TASK [horizon : include_tasks] *************************************************
2026-04-20 00:55:30.023919 | orchestrator | Monday 20 April 2026 00:55:01 +0000 (0:00:00.319) 0:00:00.908 **********
2026-04-20 00:55:30.023926 | orchestrator | included: /ansible/roles/horizon/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:55:30.023934 | orchestrator |
2026-04-20 00:55:30.023941 | orchestrator | TASK [horizon : Ensuring config
directories exist] ***************************** 2026-04-20 00:55:30.023948 | orchestrator | Monday 20 April 2026 00:55:01 +0000 (0:00:00.590) 0:00:01.498 ********** 2026-04-20 00:55:30.023958 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 
'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-20 00:55:30.024006 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg 
^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-20 00:55:30.024023 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 
'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2026-04-20 00:55:30.024031 | orchestrator |
2026-04-20 00:55:30.024040 | orchestrator | TASK [horizon : Set empty custom policy] ***************************************
2026-04-20 00:55:30.024047 | orchestrator | Monday 20 April 2026 00:55:03 +0000 (0:00:01.391) 0:00:02.890 **********
2026-04-20 00:55:30.024054 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:55:30.024061 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:55:30.024067 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:55:30.024356 | orchestrator |
2026-04-20 00:55:30.024383 | orchestrator | TASK [horizon : include_tasks] *************************************************
2026-04-20 00:55:30.024391 | orchestrator | Monday 20 April 2026 00:55:03 +0000 (0:00:00.297) 0:00:03.188 **********
2026-04-20 00:55:30.024397 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'cloudkitty', 'enabled': False})
2026-04-20 00:55:30.024404 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'heat', 'enabled': 'no'})
2026-04-20 00:55:30.024410 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'ironic', 'enabled': False})
2026-04-20 00:55:30.024417 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'masakari', 'enabled': False})
2026-04-20 00:55:30.024423 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'mistral', 'enabled': False})
2026-04-20 00:55:30.024429 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'tacker', 'enabled': False})
2026-04-20 00:55:30.024435 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'trove', 'enabled': False})
2026-04-20 00:55:30.024441 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'watcher', 'enabled': False})
2026-04-20 00:55:30.024448 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'cloudkitty', 'enabled': False})
2026-04-20 00:55:30.024454 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'heat', 'enabled': 'no'})
2026-04-20 00:55:30.024468 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'ironic', 'enabled': False})
2026-04-20 00:55:30.024474 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'masakari', 'enabled': False})
2026-04-20 00:55:30.024480 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'mistral', 'enabled': False})
2026-04-20 00:55:30.024486 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'tacker', 'enabled': False})
2026-04-20 00:55:30.024493 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'trove', 'enabled': False})
2026-04-20 00:55:30.024499 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'watcher', 'enabled': False})
2026-04-20 00:55:30.024505 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'cloudkitty', 'enabled': False})
2026-04-20 00:55:30.024511 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'heat', 'enabled': 'no'})
2026-04-20 00:55:30.024517 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'ironic', 'enabled': False})
2026-04-20 00:55:30.024523 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'masakari', 'enabled': False})
2026-04-20 00:55:30.024529 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'mistral', 'enabled': False})
2026-04-20 00:55:30.024536 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'tacker', 'enabled': False})
2026-04-20 00:55:30.024542 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'trove', 'enabled': False})
2026-04-20 00:55:30.024549 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'watcher', 'enabled': False})
2026-04-20 00:55:30.024556 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'ceilometer', 'enabled': 'yes'})
2026-04-20 00:55:30.024564 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'cinder', 'enabled': 'yes'})
2026-04-20 00:55:30.024571 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'designate', 'enabled': True})
2026-04-20 00:55:30.024577 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'glance', 'enabled': True})
2026-04-20 00:55:30.024584 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'keystone', 'enabled': True})
2026-04-20 00:55:30.024590 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'magnum', 'enabled': True})
2026-04-20 00:55:30.024596 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2
=> (item={'name': 'manila', 'enabled': True})
2026-04-20 00:55:30.024603 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'neutron', 'enabled': True})
2026-04-20 00:55:30.024609 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'nova', 'enabled': True})
2026-04-20 00:55:30.024616 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'octavia', 'enabled': True})
2026-04-20 00:55:30.024621 | orchestrator |
2026-04-20 00:55:30.024633 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-04-20 00:55:30.024640 | orchestrator | Monday 20 April 2026 00:55:04 +0000 (0:00:00.729) 0:00:03.917 **********
2026-04-20 00:55:30.024646 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:55:30.024653 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:55:30.024665 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:55:30.024671 | orchestrator |
2026-04-20 00:55:30.024677 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-04-20 00:55:30.024689 | orchestrator | Monday 20 April 2026 00:55:04 +0000 (0:00:00.111) 0:00:04.391 **********
2026-04-20 00:55:30.024695 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.024702 | orchestrator |
2026-04-20 00:55:30.024709 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-04-20 00:55:30.024715 | orchestrator | Monday 20 April 2026 00:55:04 +0000 (0:00:00.111) 0:00:04.502 **********
2026-04-20 00:55:30.024722 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.024729 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:55:30.024736 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:55:30.024742 | orchestrator |
2026-04-20 00:55:30.024749 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-04-20 00:55:30.024755 | orchestrator | Monday 20 April 2026 00:55:05 +0000 (0:00:00.294) 0:00:04.796 **********
2026-04-20 00:55:30.024761 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:55:30.024767 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:55:30.024773 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:55:30.024779 | orchestrator |
2026-04-20 00:55:30.024785 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-04-20 00:55:30.024792 | orchestrator | Monday 20 April 2026 00:55:05 +0000 (0:00:00.272) 0:00:05.069 **********
2026-04-20 00:55:30.024798 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.024804 | orchestrator |
2026-04-20 00:55:30.024810 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-04-20 00:55:30.024817 | orchestrator | Monday 20 April 2026 00:55:05 +0000 (0:00:00.142) 0:00:05.212 **********
2026-04-20 00:55:30.024823 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.024829 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:55:30.024836 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:55:30.024842 | orchestrator |
2026-04-20 00:55:30.024848 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-04-20 00:55:30.024854 | orchestrator | Monday 20 April 2026 00:55:06 +0000 (0:00:00.504) 0:00:05.716 **********
2026-04-20 00:55:30.024861 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:55:30.024867 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:55:30.024873 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:55:30.024878 | orchestrator |
2026-04-20 00:55:30.024884 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-04-20 00:55:30.024890 | orchestrator | Monday 20 April 2026 00:55:06 +0000 (0:00:00.383) 0:00:06.099 **********
2026-04-20 00:55:30.024896 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.024917 | orchestrator |
2026-04-20 00:55:30.024923 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-04-20 00:55:30.024929 | orchestrator | Monday 20 April 2026 00:55:06 +0000 (0:00:00.140) 0:00:06.240 **********
2026-04-20 00:55:30.024935 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.024941 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:55:30.024947 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:55:30.024954 | orchestrator |
2026-04-20 00:55:30.024960 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-04-20 00:55:30.024966 | orchestrator | Monday 20 April 2026 00:55:06 +0000 (0:00:00.253) 0:00:06.495 **********
2026-04-20 00:55:30.024973 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:55:30.024980 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:55:30.024986 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:55:30.024993 | orchestrator |
2026-04-20 00:55:30.025000 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-04-20 00:55:30.025007 | orchestrator | Monday 20 April 2026 00:55:07 +0000 (0:00:00.288) 0:00:06.783 **********
2026-04-20 00:55:30.025014 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.025021 | orchestrator |
2026-04-20 00:55:30.025027 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-04-20 00:55:30.025034 | orchestrator | Monday 20 April 2026 00:55:07 +0000 (0:00:00.110) 0:00:06.893 **********
2026-04-20 00:55:30.025047 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.025054 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:55:30.025061 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:55:30.025069 | orchestrator |
2026-04-20 00:55:30.025084 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-04-20 00:55:30.025097 | orchestrator | Monday 20 April 2026 00:55:07 +0000 (0:00:00.440) 0:00:07.334 **********
2026-04-20 00:55:30.025105 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:55:30.025111 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:55:30.025119 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:55:30.025126 | orchestrator |
2026-04-20 00:55:30.025132 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-04-20 00:55:30.025139 | orchestrator | Monday 20 April 2026 00:55:08 +0000 (0:00:00.299) 0:00:07.633 **********
2026-04-20 00:55:30.025146 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.025153 | orchestrator |
2026-04-20 00:55:30.025160 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-04-20 00:55:30.025167 | orchestrator | Monday 20 April 2026 00:55:08 +0000 (0:00:00.123) 0:00:07.757 **********
2026-04-20 00:55:30.025174 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.025181 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:55:30.025188 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:55:30.025196 | orchestrator |
2026-04-20 00:55:30.025203 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-04-20 00:55:30.025210 | orchestrator | Monday 20 April 2026 00:55:08 +0000 (0:00:00.255) 0:00:08.013 **********
2026-04-20 00:55:30.025218 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:55:30.025226 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:55:30.025233 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:55:30.025239 | orchestrator |
2026-04-20 00:55:30.025247 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-04-20 00:55:30.025258 | orchestrator | Monday 20 April 2026 00:55:08 +0000 (0:00:00.319) 0:00:08.332 **********
2026-04-20 00:55:30.025265 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.025272 | orchestrator |
2026-04-20 00:55:30.025279 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-04-20 00:55:30.025286 | orchestrator | Monday 20 April 2026 00:55:09 +0000 (0:00:00.212) 0:00:08.545 **********
2026-04-20 00:55:30.025301 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.025308 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:55:30.025315 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:55:30.025322 | orchestrator |
2026-04-20 00:55:30.025426 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-04-20 00:55:30.025434 | orchestrator | Monday 20 April 2026 00:55:09 +0000 (0:00:00.471) 0:00:09.016 **********
2026-04-20 00:55:30.025441 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:55:30.025448 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:55:30.025455 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:55:30.025462 | orchestrator |
2026-04-20 00:55:30.025468 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-04-20 00:55:30.025475 | orchestrator | Monday 20 April 2026 00:55:09 +0000 (0:00:00.374) 0:00:09.391 **********
2026-04-20 00:55:30.025482 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.025489 | orchestrator |
2026-04-20 00:55:30.025495 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-04-20 00:55:30.025501 | orchestrator | Monday 20 April 2026 00:55:09 +0000 (0:00:00.107) 0:00:09.499 **********
2026-04-20 00:55:30.025508 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.025514 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:55:30.025520 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:55:30.025527 | orchestrator |
2026-04-20 00:55:30.025533 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-04-20 00:55:30.025540 | orchestrator | Monday 20 April 2026 00:55:10 +0000 (0:00:00.371) 0:00:09.870 **********
2026-04-20 00:55:30.025546 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:55:30.025559 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:55:30.025565 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:55:30.025572 | orchestrator |
2026-04-20 00:55:30.025578 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-04-20 00:55:30.025585 | orchestrator | Monday 20 April 2026 00:55:10 +0000 (0:00:00.333) 0:00:10.204 **********
2026-04-20 00:55:30.025591 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.025598 | orchestrator |
2026-04-20 00:55:30.025605 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-04-20 00:55:30.025611 | orchestrator | Monday 20 April 2026 00:55:10 +0000 (0:00:00.301) 0:00:10.505 **********
2026-04-20 00:55:30.025618 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.025624 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:55:30.025630 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:55:30.025637 | orchestrator |
2026-04-20 00:55:30.025643 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-04-20 00:55:30.025649 | orchestrator | Monday 20 April 2026 00:55:11 +0000 (0:00:00.292) 0:00:10.798 **********
2026-04-20 00:55:30.025656 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:55:30.025662 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:55:30.025669 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:55:30.025676 | orchestrator |
2026-04-20 00:55:30.025683 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-04-20 00:55:30.025689 | orchestrator | Monday 20 April 2026 00:55:11 +0000 (0:00:00.307) 0:00:11.105 **********
2026-04-20 00:55:30.025696 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.025702 | orchestrator |
2026-04-20 00:55:30.025709 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-04-20 00:55:30.025716 | orchestrator | Monday 20 April 2026 00:55:11 +0000 (0:00:00.121) 0:00:11.227 **********
2026-04-20 00:55:30.025722 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.025729 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:55:30.025736 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:55:30.025742 | orchestrator |
2026-04-20 00:55:30.025748 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2026-04-20 00:55:30.025755 | orchestrator | Monday 20 April 2026 00:55:12 +0000 (0:00:00.356) 0:00:11.584 **********
2026-04-20 00:55:30.025762 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:55:30.025768 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:55:30.025775 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:55:30.025781 | orchestrator |
2026-04-20 00:55:30.025788 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2026-04-20 00:55:30.025794 | orchestrator | Monday 20 April 2026 00:55:12 +0000 (0:00:00.463) 0:00:12.047 **********
2026-04-20 00:55:30.025799 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.025806 | orchestrator |
2026-04-20 00:55:30.025812 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2026-04-20 00:55:30.025819 | orchestrator | Monday 20 April 2026 00:55:12 +0000 (0:00:00.117) 0:00:12.165 **********
2026-04-20 00:55:30.025825 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.025831 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:55:30.025838 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:55:30.025844 | orchestrator |
2026-04-20 00:55:30.025851 | orchestrator | TASK [horizon : Copying over config.json files for services] *******************
2026-04-20 00:55:30.025858 | orchestrator | Monday 20 April 2026 00:55:12 +0000 (0:00:00.297) 0:00:12.462 **********
2026-04-20 00:55:30.025864 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:55:30.025871 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:55:30.025878 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:55:30.025884 | orchestrator |
2026-04-20 00:55:30.025891 | orchestrator | TASK [horizon : Copying over horizon.conf] *************************************
2026-04-20 00:55:30.025897 | orchestrator | Monday 20 April 2026 00:55:14 +0000 (0:00:01.837) 0:00:14.300 **********
2026-04-20 00:55:30.025903 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/horizon/templates/horizon.conf.j2)
2026-04-20 00:55:30.025915 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/horizon/templates/horizon.conf.j2)
2026-04-20 00:55:30.025921 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/horizon/templates/horizon.conf.j2)
2026-04-20 00:55:30.025928 | orchestrator |
2026-04-20 00:55:30.025935 | orchestrator | TASK [horizon : Copying over kolla-settings.py] ********************************
2026-04-20 00:55:30.025945 | orchestrator | Monday 20 April 2026 00:55:16 +0000 (0:00:02.093) 0:00:16.393 **********
2026-04-20 00:55:30.025952 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/horizon/templates/_9998-kolla-settings.py.j2)
2026-04-20 00:55:30.025966 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/horizon/templates/_9998-kolla-settings.py.j2)
2026-04-20 00:55:30.025972 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/horizon/templates/_9998-kolla-settings.py.j2)
2026-04-20 00:55:30.025978 | orchestrator |
2026-04-20 00:55:30.025983 | orchestrator | TASK [horizon : Copying over custom-settings.py] *******************************
2026-04-20 00:55:30.025989 | orchestrator | Monday 20 April 2026 00:55:18 +0000 (0:00:02.112) 0:00:18.505 **********
2026-04-20 00:55:30.025994 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/horizon/templates/_9999-custom-settings.py.j2)
2026-04-20 00:55:30.025999 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/horizon/templates/_9999-custom-settings.py.j2)
2026-04-20 00:55:30.026005 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/horizon/templates/_9999-custom-settings.py.j2)
2026-04-20 00:55:30.026048 | orchestrator |
2026-04-20 00:55:30.026057 | orchestrator | TASK [horizon : Copying over existing policy file] *****************************
2026-04-20 00:55:30.026064 | orchestrator | Monday 20 April 2026 00:55:20 +0000 (0:00:01.367) 0:00:19.873 **********
2026-04-20 00:55:30.026070 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.026076 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:55:30.026082 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:55:30.026087 | orchestrator |
2026-04-20 00:55:30.026093 | orchestrator | TASK [horizon : Copying over custom themes] ************************************
2026-04-20 00:55:30.026098 | orchestrator | Monday 20 April 2026 00:55:20 +0000 (0:00:00.241) 0:00:20.114 **********
2026-04-20 00:55:30.026104 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:55:30.026110 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:55:30.026115 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:55:30.026121 | orchestrator |
2026-04-20 00:55:30.026127 | orchestrator | TASK [horizon : include_tasks] *************************************************
2026-04-20 00:55:30.026133 | orchestrator | Monday 20 April 2026 00:55:20 +0000 (0:00:00.246) 0:00:20.361 **********
2026-04-20 00:55:30.026139 | orchestrator | included:
/ansible/roles/horizon/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:55:30.026145 | orchestrator | 2026-04-20 00:55:30.026151 | orchestrator | TASK [service-cert-copy : horizon | Copying over extra CA certificates] ******** 2026-04-20 00:55:30.026157 | orchestrator | Monday 20 April 2026 00:55:21 +0000 (0:00:00.590) 0:00:20.952 ********** 2026-04-20 00:55:30.026169 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 
'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-20 00:55:30.026190 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 
'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-20 00:55:30.026208 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': 
{'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-20 00:55:30.026219 | orchestrator | 2026-04-20 00:55:30.026225 | orchestrator | TASK [service-cert-copy : horizon | Copying over backend internal TLS certificate] *** 2026-04-20 00:55:30.026232 | orchestrator | Monday 20 April 2026 00:55:23 +0000 (0:00:01.633) 0:00:22.585 ********** 2026-04-20 00:55:30.026238 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 
'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-20 00:55:30.026248 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:55:30.026262 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 
'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg 
^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-20 00:55:30.026270 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:55:30.026277 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg 
^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-20 00:55:30.026289 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:55:30.026295 | orchestrator | 2026-04-20 00:55:30.026302 | orchestrator | TASK [service-cert-copy : horizon | Copying over backend internal TLS key] ***** 2026-04-20 00:55:30.026308 | orchestrator | Monday 20 April 2026 00:55:23 +0000 (0:00:00.641) 0:00:23.227 ********** 2026-04-20 00:55:30.026323 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 
'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-20 00:55:30.026347 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:55:30.026354 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-20 00:55:30.026366 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:55:30.026381 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 
'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-20 00:55:30.026390 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:55:30.026397 | orchestrator | 2026-04-20 00:55:30.026404 | orchestrator | TASK 
[service-check-containers : horizon | Check containers] ******************* 2026-04-20 00:55:30.026415 | orchestrator | Monday 20 April 2026 00:55:25 +0000 (0:00:01.357) 0:00:24.584 ********** 2026-04-20 00:55:30.026429 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg 
^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-20 00:55:30.026437 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back 
if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-20 00:55:30.026456 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg 
^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2026-04-20 00:55:30.026464 | orchestrator | 2026-04-20 00:55:30.026471 | orchestrator | TASK [service-check-containers : horizon | Notify handlers to restart containers] *** 2026-04-20 00:55:30.026478 | orchestrator | Monday 20 April 2026 00:55:26 +0000 (0:00:01.727) 0:00:26.312 ********** 2026-04-20 00:55:30.026484 | orchestrator | changed: [testbed-node-0] => { 2026-04-20 00:55:30.026491 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:55:30.026498 | orchestrator | } 2026-04-20 00:55:30.026504 | orchestrator | changed: [testbed-node-1] => { 2026-04-20 00:55:30.026511 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:55:30.026518 | orchestrator | } 2026-04-20 00:55:30.026524 | orchestrator | changed: [testbed-node-2] => { 2026-04-20 00:55:30.026530 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:55:30.026537 | orchestrator | } 2026-04-20 00:55:30.026543 | orchestrator | 2026-04-20 00:55:30.026549 | 
orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-20 00:55:30.026555 | orchestrator | Monday 20 April 2026 00:55:27 +0000 (0:00:00.325) 0:00:26.637 ********** 2026-04-20 00:55:30.026562 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { 
path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-20 00:55:30.026573 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:55:30.026605 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': 
'80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-20 00:55:30.026617 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:55:30.026630 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/horizon:25.3.3.20260328', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_VENUS': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 
'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin', 'option httpchk'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2026-04-20 00:55:30.026636 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:55:30.026642 | orchestrator | 2026-04-20 00:55:30.026648 | orchestrator | TASK [horizon : include_tasks] ************************************************* 2026-04-20 00:55:30.026653 | orchestrator | Monday 20 April 2026 00:55:28 +0000 (0:00:01.247) 0:00:27.885 ********** 2026-04-20 00:55:30.026662 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:55:30.026668 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:55:30.026674 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:55:30.026680 | orchestrator | 2026-04-20 00:55:30.026686 | orchestrator | TASK [horizon : include_tasks] ************************************************* 2026-04-20 00:55:30.026691 | orchestrator | Monday 20 April 2026 00:55:28 
+0000 (0:00:00.258) 0:00:28.143 **********
2026-04-20 00:55:30.026698 | orchestrator | included: /ansible/roles/horizon/tasks/bootstrap.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:55:30.026704 | orchestrator |
2026-04-20 00:55:30.026711 | orchestrator | TASK [horizon : Creating Horizon database] *************************************
2026-04-20 00:55:30.026716 | orchestrator | Monday 20 April 2026 00:55:29 +0000 (0:00:00.474) 0:00:28.618 **********
2026-04-20 00:55:30.026722 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is missing or not running!"}
2026-04-20 00:55:30.026728 | orchestrator |
2026-04-20 00:55:30.026735 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:55:30.026745 | orchestrator | testbed-node-0 : ok=34  changed=8  unreachable=0 failed=1  skipped=26  rescued=0 ignored=0
2026-04-20 00:55:30.026753 | orchestrator | testbed-node-1 : ok=34  changed=8  unreachable=0 failed=0 skipped=16  rescued=0 ignored=0
2026-04-20 00:55:30.026760 | orchestrator | testbed-node-2 : ok=34  changed=8  unreachable=0 failed=0 skipped=16  rescued=0 ignored=0
2026-04-20 00:55:30.026766 | orchestrator |
2026-04-20 00:55:30.026773 | orchestrator |
2026-04-20 00:55:30.026779 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:55:30.026785 | orchestrator | Monday 20 April 2026 00:55:29 +0000 (0:00:00.686) 0:00:29.304 **********
2026-04-20 00:55:30.026791 | orchestrator | ===============================================================================
2026-04-20 00:55:30.026797 | orchestrator | horizon : Copying over kolla-settings.py -------------------------------- 2.11s
2026-04-20 00:55:30.026803 | orchestrator | horizon : Copying over horizon.conf ------------------------------------- 2.09s
2026-04-20 00:55:30.026809 | orchestrator | horizon : Copying over config.json files for services ------------------- 1.84s
2026-04-20 00:55:30.026816 | orchestrator | service-check-containers : horizon | Check containers ------------------- 1.73s
2026-04-20 00:55:30.026822 | orchestrator | service-cert-copy : horizon | Copying over extra CA certificates -------- 1.63s
2026-04-20 00:55:30.026828 | orchestrator | horizon : Ensuring config directories exist ----------------------------- 1.39s
2026-04-20 00:55:30.026834 | orchestrator | horizon : Copying over custom-settings.py ------------------------------- 1.37s
2026-04-20 00:55:30.026841 | orchestrator | service-cert-copy : horizon | Copying over backend internal TLS key ----- 1.36s
2026-04-20 00:55:30.026847 | orchestrator | service-check-containers : Include tasks -------------------------------- 1.25s
2026-04-20 00:55:30.026854 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.73s
2026-04-20 00:55:30.026860 | orchestrator | horizon : Creating Horizon database ------------------------------------- 0.69s
2026-04-20 00:55:30.026866 | orchestrator | service-cert-copy : horizon | Copying over backend internal TLS certificate --- 0.64s
2026-04-20 00:55:30.026872 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.59s
2026-04-20 00:55:30.026879 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.59s
2026-04-20 00:55:30.026886 | orchestrator | horizon : Update custom policy file name -------------------------------- 0.50s
2026-04-20 00:55:30.026893 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.47s
2026-04-20 00:55:30.026899 | orchestrator | horizon : Update policy file name --------------------------------------- 0.47s
2026-04-20 00:55:30.026905 | orchestrator | horizon : Update custom policy file name -------------------------------- 0.47s
2026-04-20 00:55:30.026911 | orchestrator | horizon : Update policy file name --------------------------------------- 0.46s
2026-04-20 00:55:30.026918 | orchestrator | horizon : Update custom policy file name -------------------------------- 0.44s
2026-04-20 00:55:30.026924 | orchestrator | 2026-04-20 00:55:30 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:55:30.026931 | orchestrator | 2026-04-20 00:55:30 | INFO  | Task 373cb906-bfdf-4b1b-930c-055623da957a is in state STARTED
2026-04-20 00:55:30.026937 | orchestrator | 2026-04-20 00:55:30 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:55:33.056025 | orchestrator | 2026-04-20 00:55:33 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:55:33.056087 | orchestrator | 2026-04-20 00:55:33 | INFO  | Task 373cb906-bfdf-4b1b-930c-055623da957a is in state STARTED
2026-04-20 00:55:33.056096 | orchestrator | 2026-04-20 00:55:33 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:55:36.119127 | orchestrator | 2026-04-20 00:55:36 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:55:36.120758 | orchestrator | 2026-04-20 00:55:36 | INFO  | Task 373cb906-bfdf-4b1b-930c-055623da957a is in state STARTED
2026-04-20 00:55:36.120811 | orchestrator | 2026-04-20 00:55:36 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:55:39.155822 | orchestrator | 2026-04-20 00:55:39 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:55:39.159592 | orchestrator | 2026-04-20 00:55:39 | INFO  | Task 373cb906-bfdf-4b1b-930c-055623da957a is in state STARTED
2026-04-20 00:55:39.159825 | orchestrator | 2026-04-20 00:55:39 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:55:42.199498 | orchestrator | 2026-04-20 00:55:42 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:55:42.201501 | orchestrator | 2026-04-20 00:55:42 | INFO  | Task 373cb906-bfdf-4b1b-930c-055623da957a is in state STARTED
2026-04-20 00:55:42.201555 | orchestrator | 2026-04-20 00:55:42 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:55:45.234534 | orchestrator | 2026-04-20 00:55:45 | INFO  | Task 969e49e8-1788-4df1-b46f-f5763d58cb2c is in state STARTED
2026-04-20 00:55:45.235194 | orchestrator | 2026-04-20 00:55:45 | INFO  | Task 9122fa4c-ce3b-430a-9bab-366eba5b1cb8 is in state STARTED
2026-04-20 00:55:45.236072 | orchestrator | 2026-04-20 00:55:45 | INFO  | Task 6588335a-6901-4c6d-b89f-01837568f46b is in state STARTED
2026-04-20 00:55:45.238915 | orchestrator | 2026-04-20 00:55:45 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:55:45.241743 | orchestrator | 2026-04-20 00:55:45 | INFO  | Task 373cb906-bfdf-4b1b-930c-055623da957a is in state SUCCESS
2026-04-20 00:55:45.243170 | orchestrator |
2026-04-20 00:55:45.243200 | orchestrator |
2026-04-20 00:55:45.243207 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-20 00:55:45.243214 | orchestrator |
2026-04-20 00:55:45.243221 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-20 00:55:45.243227 | orchestrator | Monday 20 April 2026 00:55:00 +0000 (0:00:00.318) 0:00:00.318 **********
2026-04-20 00:55:45.243233 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:55:45.243241 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:55:45.243247 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:55:45.243253 | orchestrator |
2026-04-20 00:55:45.243260 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-20 00:55:45.243266 | orchestrator | Monday 20 April 2026 00:55:00 +0000 (0:00:00.276) 0:00:00.595 **********
2026-04-20 00:55:45.243273 | orchestrator | ok: [testbed-node-0] => (item=enable_keystone_True)
2026-04-20 00:55:45.243280 | orchestrator | ok: [testbed-node-1] => (item=enable_keystone_True)
2026-04-20 00:55:45.243286 | orchestrator |
ok: [testbed-node-2] => (item=enable_keystone_True) 2026-04-20 00:55:45.243301 | orchestrator | 2026-04-20 00:55:45.243309 | orchestrator | PLAY [Apply role keystone] ***************************************************** 2026-04-20 00:55:45.243379 | orchestrator | 2026-04-20 00:55:45.243386 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2026-04-20 00:55:45.243393 | orchestrator | Monday 20 April 2026 00:55:01 +0000 (0:00:00.289) 0:00:00.885 ********** 2026-04-20 00:55:45.243399 | orchestrator | included: /ansible/roles/keystone/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:55:45.243406 | orchestrator | 2026-04-20 00:55:45.243412 | orchestrator | TASK [keystone : Ensuring config directories exist] **************************** 2026-04-20 00:55:45.243418 | orchestrator | Monday 20 April 2026 00:55:01 +0000 (0:00:00.644) 0:00:01.529 ********** 2026-04-20 00:55:45.243427 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': 
'5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-20 00:55:45.243519 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-20 00:55:45.243683 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': 
'5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-20 00:55:45.243718 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-04-20 00:55:45.243773 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-04-20 00:55:45.243837 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-04-20 00:55:45.243894 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-20 00:55:45.243952 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-20 00:55:45.244076 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': 
['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-20 00:55:45.244138 | orchestrator | 2026-04-20 00:55:45.244150 | orchestrator | TASK [keystone : Check if policies shall be overwritten] *********************** 2026-04-20 00:55:45.244178 | orchestrator | Monday 20 April 2026 00:55:03 +0000 (0:00:02.101) 0:00:03.631 ********** 2026-04-20 00:55:45.244185 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:55:45.244192 | orchestrator | 2026-04-20 00:55:45.244198 | orchestrator | TASK [keystone : Set keystone policy file] ************************************* 2026-04-20 00:55:45.244232 | orchestrator | Monday 20 April 2026 00:55:04 +0000 (0:00:00.132) 0:00:03.764 ********** 2026-04-20 00:55:45.244240 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:55:45.244246 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:55:45.244252 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:55:45.244277 | orchestrator | 2026-04-20 00:55:45.244300 | orchestrator | TASK [keystone : Check if Keystone domain-specific config is supplied] ********* 2026-04-20 00:55:45.244307 | orchestrator | Monday 20 April 2026 00:55:04 +0000 (0:00:00.252) 0:00:04.016 ********** 2026-04-20 00:55:45.244381 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-04-20 00:55:45.244388 | orchestrator | 2026-04-20 00:55:45.244404 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2026-04-20 00:55:45.244432 | orchestrator | Monday 20 April 2026 00:55:05 +0000 (0:00:00.901) 0:00:04.918 ********** 2026-04-20 00:55:45.244439 | orchestrator | included: /ansible/roles/keystone/tasks/copy-certs.yml for 
testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:55:45.244446 | orchestrator | 2026-04-20 00:55:45.244452 | orchestrator | TASK [service-cert-copy : keystone | Copying over extra CA certificates] ******* 2026-04-20 00:55:45.244459 | orchestrator | Monday 20 April 2026 00:55:05 +0000 (0:00:00.660) 0:00:05.579 ********** 2026-04-20 00:55:45.244488 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-20 00:55:45.244503 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-20 00:55:45.244547 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-20 00:55:45.244559 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 
'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-04-20 00:55:45.244573 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-04-20 00:55:45.244627 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-04-20 00:55:45.244665 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': 
['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-20 00:55:45.244690 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-20 00:55:45.244725 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-20 00:55:45.244733 | orchestrator | 2026-04-20 00:55:45.244763 | orchestrator | TASK [service-cert-copy : keystone | Copying over backend internal TLS certificate] *** 2026-04-20 00:55:45.244770 | orchestrator | Monday 20 April 2026 00:55:08 +0000 (0:00:02.738) 
0:00:08.318 ********** 2026-04-20 00:55:45.244783 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-20 00:55:45.244790 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:55:45.244836 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': 
True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-20 00:55:45.244845 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:55:45.244876 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-20 00:55:45.244889 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': 
['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:55:45.244901 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-20 00:55:45.244926 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:55:45.244934 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option 
httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-20 00:55:45.244963 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:55:45.244970 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-20 00:55:45.244977 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:55:45.244984 | orchestrator | 2026-04-20 00:55:45.245008 | orchestrator | TASK [service-cert-copy : keystone | Copying over backend internal TLS key] **** 2026-04-20 00:55:45.245017 | orchestrator | Monday 20 April 2026 00:55:09 +0000 (0:00:00.581) 0:00:08.899 ********** 2026-04-20 00:55:45.245048 | orchestrator | skipping: 
[testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-20 00:55:45.245060 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:55:45.245067 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-20 00:55:45.245074 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:55:45.245104 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-20 00:55:45.245113 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': 
['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:55:45.245124 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-20 00:55:45.245167 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:55:45.245175 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option 
httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-20 00:55:45.245182 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:55:45.245188 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-20 00:55:45.245194 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:55:45.245201 | orchestrator | 2026-04-20 00:55:45.245210 | orchestrator | TASK [keystone : Copying over config.json files for services] ****************** 2026-04-20 00:55:45.245217 | orchestrator | Monday 20 April 2026 00:55:10 +0000 (0:00:01.175) 0:00:10.075 ********** 2026-04-20 00:55:45.245264 | orchestrator | changed: 
[testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-20 00:55:45.245307 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': 
True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-20 00:55:45.245359 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-20 00:55:45.245369 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 
2026-04-20 00:55:45.245437 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-04-20 00:55:45.245448 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-04-20 00:55:45.245514 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-20 
00:55:45.245524 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-20 00:55:45.245580 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-20 00:55:45.245597 | orchestrator | 2026-04-20 00:55:45.245604 | orchestrator | TASK [keystone : Copying over keystone.conf] *********************************** 2026-04-20 00:55:45.245610 | orchestrator | Monday 20 April 2026 00:55:13 +0000 (0:00:02.907) 0:00:12.983 ********** 2026-04-20 00:55:45.245621 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-20 00:55:45.245672 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:55:45.245709 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-20 00:55:45.245718 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:55:45.245725 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 
'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-20 00:55:45.245779 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:55:45.245806 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-20 00:55:45.245826 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': 
['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-20 00:55:45.245834 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-20 00:55:45.245861 | orchestrator | 2026-04-20 00:55:45.245894 | orchestrator | TASK [keystone : Copying keystone-startup script for keystone] ***************** 2026-04-20 00:55:45.245920 | orchestrator | Monday 20 April 2026 00:55:18 +0000 (0:00:05.052) 0:00:18.036 ********** 2026-04-20 00:55:45.245927 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:55:45.245935 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:55:45.245960 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:55:45.245969 | orchestrator | 2026-04-20 00:55:45.245976 | orchestrator | TASK [keystone : Create Keystone domain-specific config directory] ************* 2026-04-20 00:55:45.246000 | orchestrator | Monday 20 April 2026 00:55:19 +0000 (0:00:01.159) 0:00:19.195 ********** 2026-04-20 00:55:45.246007 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:55:45.246068 | orchestrator | skipping: [testbed-node-1] 2026-04-20 
00:55:45.246105 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:55:45.246113 | orchestrator | 2026-04-20 00:55:45.246140 | orchestrator | TASK [keystone : Get file list in custom domains folder] *********************** 2026-04-20 00:55:45.246149 | orchestrator | Monday 20 April 2026 00:55:20 +0000 (0:00:00.716) 0:00:19.911 ********** 2026-04-20 00:55:45.246174 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:55:45.246180 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:55:45.246186 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:55:45.246192 | orchestrator | 2026-04-20 00:55:45.246216 | orchestrator | TASK [keystone : Copying Keystone Domain specific settings] ******************** 2026-04-20 00:55:45.246241 | orchestrator | Monday 20 April 2026 00:55:20 +0000 (0:00:00.397) 0:00:20.309 ********** 2026-04-20 00:55:45.246248 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:55:45.246254 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:55:45.246261 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:55:45.246287 | orchestrator | 2026-04-20 00:55:45.246293 | orchestrator | TASK [keystone : Copying over existing policy file] **************************** 2026-04-20 00:55:45.246300 | orchestrator | Monday 20 April 2026 00:55:20 +0000 (0:00:00.276) 0:00:20.586 ********** 2026-04-20 00:55:45.246387 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl 
http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-20 00:55:45.246408 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:55:45.246449 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-20 00:55:45.246457 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:55:45.246467 | orchestrator | skipping: [testbed-node-1] => (item={'key': 
'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-20 00:55:45.246476 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:55:45.246521 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': 
['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-20 00:55:45.246530 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:55:45.246557 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-20 00:55:45.246592 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:55:45.246601 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-20 00:55:45.246609 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:55:45.246654 | orchestrator | 2026-04-20 00:55:45.246662 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2026-04-20 00:55:45.246668 | orchestrator | Monday 20 April 2026 00:55:21 +0000 (0:00:00.473) 0:00:21.059 ********** 2026-04-20 00:55:45.246674 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:55:45.246704 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:55:45.246711 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:55:45.246717 | orchestrator | 2026-04-20 00:55:45.246723 | orchestrator | TASK [keystone : Copying over wsgi-keystone.conf] ****************************** 2026-04-20 00:55:45.246745 | orchestrator | Monday 20 April 2026 00:55:21 +0000 (0:00:00.244) 0:00:21.303 ********** 2026-04-20 00:55:45.246752 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/keystone/templates/wsgi-keystone.conf.j2) 2026-04-20 00:55:45.246780 | orchestrator | changed: [testbed-node-1] => 
(item=/ansible/roles/keystone/templates/wsgi-keystone.conf.j2) 2026-04-20 00:55:45.246787 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/keystone/templates/wsgi-keystone.conf.j2) 2026-04-20 00:55:45.246814 | orchestrator | 2026-04-20 00:55:45.246821 | orchestrator | TASK [keystone : Checking whether keystone-paste.ini file exists] ************** 2026-04-20 00:55:45.246827 | orchestrator | Monday 20 April 2026 00:55:23 +0000 (0:00:01.846) 0:00:23.150 ********** 2026-04-20 00:55:45.246833 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-04-20 00:55:45.246838 | orchestrator | 2026-04-20 00:55:45.246879 | orchestrator | TASK [keystone : Copying over keystone-paste.ini] ****************************** 2026-04-20 00:55:45.246905 | orchestrator | Monday 20 April 2026 00:55:24 +0000 (0:00:01.029) 0:00:24.179 ********** 2026-04-20 00:55:45.246928 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:55:45.246935 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:55:45.246941 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:55:45.246947 | orchestrator | 2026-04-20 00:55:45.246953 | orchestrator | TASK [keystone : Generate the required cron jobs for the node] ***************** 2026-04-20 00:55:45.246996 | orchestrator | Monday 20 April 2026 00:55:25 +0000 (0:00:00.635) 0:00:24.815 ********** 2026-04-20 00:55:45.247003 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-04-20 00:55:45.247024 | orchestrator | ok: [testbed-node-1 -> localhost] 2026-04-20 00:55:45.247030 | orchestrator | ok: [testbed-node-2 -> localhost] 2026-04-20 00:55:45.247036 | orchestrator | 2026-04-20 00:55:45.247042 | orchestrator | TASK [keystone : Set fact with the generated cron jobs for building the crontab later] *** 2026-04-20 00:55:45.247072 | orchestrator | Monday 20 April 2026 00:55:26 +0000 (0:00:01.745) 0:00:26.561 ********** 2026-04-20 00:55:45.247100 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:55:45.247107 | orchestrator | ok: 
[testbed-node-1] 2026-04-20 00:55:45.247113 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:55:45.247119 | orchestrator | 2026-04-20 00:55:45.247160 | orchestrator | TASK [keystone : Copying files for keystone-fernet] **************************** 2026-04-20 00:55:45.247186 | orchestrator | Monday 20 April 2026 00:55:27 +0000 (0:00:00.316) 0:00:26.877 ********** 2026-04-20 00:55:45.247193 | orchestrator | changed: [testbed-node-0] => (item={'src': 'crontab.j2', 'dest': 'crontab'}) 2026-04-20 00:55:45.247218 | orchestrator | changed: [testbed-node-1] => (item={'src': 'crontab.j2', 'dest': 'crontab'}) 2026-04-20 00:55:45.247224 | orchestrator | changed: [testbed-node-2] => (item={'src': 'crontab.j2', 'dest': 'crontab'}) 2026-04-20 00:55:45.247230 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-rotate.sh.j2', 'dest': 'fernet-rotate.sh'}) 2026-04-20 00:55:45.247254 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-rotate.sh.j2', 'dest': 'fernet-rotate.sh'}) 2026-04-20 00:55:45.247260 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-rotate.sh.j2', 'dest': 'fernet-rotate.sh'}) 2026-04-20 00:55:45.247282 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-node-sync.sh.j2', 'dest': 'fernet-node-sync.sh'}) 2026-04-20 00:55:45.247290 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-node-sync.sh.j2', 'dest': 'fernet-node-sync.sh'}) 2026-04-20 00:55:45.247296 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-node-sync.sh.j2', 'dest': 'fernet-node-sync.sh'}) 2026-04-20 00:55:45.247303 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-push.sh.j2', 'dest': 'fernet-push.sh'}) 2026-04-20 00:55:45.247308 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-push.sh.j2', 'dest': 'fernet-push.sh'}) 2026-04-20 00:55:45.247350 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-push.sh.j2', 'dest': 'fernet-push.sh'}) 2026-04-20 
00:55:45.247357 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-healthcheck.sh.j2', 'dest': 'fernet-healthcheck.sh'}) 2026-04-20 00:55:45.247380 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-healthcheck.sh.j2', 'dest': 'fernet-healthcheck.sh'}) 2026-04-20 00:55:45.247398 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-healthcheck.sh.j2', 'dest': 'fernet-healthcheck.sh'}) 2026-04-20 00:55:45.247404 | orchestrator | changed: [testbed-node-1] => (item={'src': 'id_rsa', 'dest': 'id_rsa'}) 2026-04-20 00:55:45.247436 | orchestrator | changed: [testbed-node-0] => (item={'src': 'id_rsa', 'dest': 'id_rsa'}) 2026-04-20 00:55:45.247443 | orchestrator | changed: [testbed-node-2] => (item={'src': 'id_rsa', 'dest': 'id_rsa'}) 2026-04-20 00:55:45.247449 | orchestrator | changed: [testbed-node-1] => (item={'src': 'ssh_config.j2', 'dest': 'ssh_config'}) 2026-04-20 00:55:45.247474 | orchestrator | changed: [testbed-node-0] => (item={'src': 'ssh_config.j2', 'dest': 'ssh_config'}) 2026-04-20 00:55:45.247480 | orchestrator | changed: [testbed-node-2] => (item={'src': 'ssh_config.j2', 'dest': 'ssh_config'}) 2026-04-20 00:55:45.247486 | orchestrator | 2026-04-20 00:55:45.247507 | orchestrator | TASK [keystone : Copying files for keystone-ssh] ******************************* 2026-04-20 00:55:45.247515 | orchestrator | Monday 20 April 2026 00:55:35 +0000 (0:00:07.921) 0:00:34.799 ********** 2026-04-20 00:55:45.247544 | orchestrator | changed: [testbed-node-0] => (item={'src': 'sshd_config.j2', 'dest': 'sshd_config'}) 2026-04-20 00:55:45.247567 | orchestrator | changed: [testbed-node-1] => (item={'src': 'sshd_config.j2', 'dest': 'sshd_config'}) 2026-04-20 00:55:45.247574 | orchestrator | changed: [testbed-node-2] => (item={'src': 'sshd_config.j2', 'dest': 'sshd_config'}) 2026-04-20 00:55:45.247580 | orchestrator | changed: [testbed-node-0] => (item={'src': 'id_rsa.pub', 'dest': 'id_rsa.pub'}) 2026-04-20 00:55:45.247587 | 
orchestrator | changed: [testbed-node-1] => (item={'src': 'id_rsa.pub', 'dest': 'id_rsa.pub'}) 2026-04-20 00:55:45.247593 | orchestrator | changed: [testbed-node-2] => (item={'src': 'id_rsa.pub', 'dest': 'id_rsa.pub'}) 2026-04-20 00:55:45.247619 | orchestrator | 2026-04-20 00:55:45.247649 | orchestrator | TASK [service-check-containers : keystone | Check containers] ****************** 2026-04-20 00:55:45.247655 | orchestrator | Monday 20 April 2026 00:55:37 +0000 (0:00:02.370) 0:00:37.170 ********** 2026-04-20 00:55:45.247683 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-20 00:55:45.247712 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-20 00:55:45.247755 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}}) 2026-04-20 00:55:45.247763 | orchestrator | changed: [testbed-node-2] => (item={'key': 
'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-04-20 00:55:45.247787 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-04-20 00:55:45.247797 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2026-04-20 00:55:45.247822 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': 
{'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-20 00:55:45.247833 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-20 00:55:45.247862 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2026-04-20 00:55:45.247886 | orchestrator | 2026-04-20 00:55:45.247893 | orchestrator | TASK 
[service-check-containers : keystone | Notify handlers to restart containers] *** 2026-04-20 00:55:45.247899 | orchestrator | Monday 20 April 2026 00:55:39 +0000 (0:00:02.258) 0:00:39.429 ********** 2026-04-20 00:55:45.247906 | orchestrator | changed: [testbed-node-0] => { 2026-04-20 00:55:45.247912 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:55:45.247919 | orchestrator | } 2026-04-20 00:55:45.247925 | orchestrator | changed: [testbed-node-1] => { 2026-04-20 00:55:45.247957 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:55:45.247964 | orchestrator | } 2026-04-20 00:55:45.247970 | orchestrator | changed: [testbed-node-2] => { 2026-04-20 00:55:45.247993 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 00:55:45.247999 | orchestrator | } 2026-04-20 00:55:45.248006 | orchestrator | 2026-04-20 00:55:45.248013 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-20 00:55:45.248038 | orchestrator | Monday 20 April 2026 00:55:40 +0000 (0:00:00.297) 0:00:39.726 ********** 2026-04-20 00:55:45.248066 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 
'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-20 00:55:45.248097 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:55:45.248105 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-20 00:55:45.248135 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:55:45.248167 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-20 00:55:45.248176 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:55:45.248182 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-20 00:55:45.248188 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:55:45.248218 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '/dev/shm:/dev/shm', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin', 'option httpchk']}}}})  2026-04-20 00:55:45.248232 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-ssh:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2026-04-20 00:55:45.248263 | orchestrator | skipping: 
[testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/keystone-fernet:27.0.1.20260328', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2026-04-20 00:55:45.248270 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:55:45.248277 | orchestrator | 2026-04-20 00:55:45.248283 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2026-04-20 00:55:45.248323 | orchestrator | Monday 20 April 2026 00:55:41 +0000 (0:00:00.935) 0:00:40.662 ********** 2026-04-20 00:55:45.248352 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:55:45.248358 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:55:45.248365 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:55:45.248397 | orchestrator | 2026-04-20 00:55:45.248406 | orchestrator | TASK [keystone : Creating keystone database] *********************************** 2026-04-20 00:55:45.248413 | orchestrator | Monday 20 April 2026 00:55:41 +0000 (0:00:00.265) 0:00:40.927 ********** 2026-04-20 00:55:45.248437 | orchestrator | fatal: [testbed-node-0]: FAILED! 
=> {"changed": false, "msg": "kolla_toolbox container is missing or not running!"} 2026-04-20 00:55:45.248444 | orchestrator | 2026-04-20 00:55:45.248451 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:55:45.248479 | orchestrator | testbed-node-0 : ok=18  changed=10  unreachable=0 failed=1  skipped=12  rescued=0 ignored=0 2026-04-20 00:55:45.248488 | orchestrator | testbed-node-1 : ok=16  changed=10  unreachable=0 failed=0 skipped=11  rescued=0 ignored=0 2026-04-20 00:55:45.248514 | orchestrator | testbed-node-2 : ok=16  changed=10  unreachable=0 failed=0 skipped=11  rescued=0 ignored=0 2026-04-20 00:55:45.248522 | orchestrator | 2026-04-20 00:55:45.248529 | orchestrator | 2026-04-20 00:55:45.248536 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 00:55:45.248562 | orchestrator | Monday 20 April 2026 00:55:41 +0000 (0:00:00.698) 0:00:41.625 ********** 2026-04-20 00:55:45.248570 | orchestrator | =============================================================================== 2026-04-20 00:55:45.248577 | orchestrator | keystone : Copying files for keystone-fernet ---------------------------- 7.92s 2026-04-20 00:55:45.248584 | orchestrator | keystone : Copying over keystone.conf ----------------------------------- 5.05s 2026-04-20 00:55:45.248617 | orchestrator | keystone : Copying over config.json files for services ------------------ 2.91s 2026-04-20 00:55:45.248624 | orchestrator | service-cert-copy : keystone | Copying over extra CA certificates ------- 2.74s 2026-04-20 00:55:45.248630 | orchestrator | keystone : Copying files for keystone-ssh ------------------------------- 2.37s 2026-04-20 00:55:45.248680 | orchestrator | service-check-containers : keystone | Check containers ------------------ 2.26s 2026-04-20 00:55:45.248688 | orchestrator | keystone : Ensuring config directories exist ---------------------------- 2.10s 2026-04-20 
00:55:45.248718 | orchestrator | keystone : Copying over wsgi-keystone.conf ------------------------------ 1.85s 2026-04-20 00:55:45.248744 | orchestrator | keystone : Generate the required cron jobs for the node ----------------- 1.75s 2026-04-20 00:55:45.248751 | orchestrator | service-cert-copy : keystone | Copying over backend internal TLS key ---- 1.18s 2026-04-20 00:55:45.248758 | orchestrator | keystone : Copying keystone-startup script for keystone ----------------- 1.16s 2026-04-20 00:55:45.248769 | orchestrator | keystone : Checking whether keystone-paste.ini file exists -------------- 1.03s 2026-04-20 00:55:45.248818 | orchestrator | service-check-containers : Include tasks -------------------------------- 0.94s 2026-04-20 00:55:45.248826 | orchestrator | keystone : Check if Keystone domain-specific config is supplied --------- 0.90s 2026-04-20 00:55:45.248849 | orchestrator | keystone : Create Keystone domain-specific config directory ------------- 0.72s 2026-04-20 00:55:45.248857 | orchestrator | keystone : Creating keystone database ----------------------------------- 0.70s 2026-04-20 00:55:45.248886 | orchestrator | keystone : include_tasks ------------------------------------------------ 0.66s 2026-04-20 00:55:45.248916 | orchestrator | keystone : include_tasks ------------------------------------------------ 0.64s 2026-04-20 00:55:45.248925 | orchestrator | keystone : Copying over keystone-paste.ini ------------------------------ 0.64s 2026-04-20 00:55:45.248932 | orchestrator | service-cert-copy : keystone | Copying over backend internal TLS certificate --- 0.58s 2026-04-20 00:55:45.248939 | orchestrator | 2026-04-20 00:55:45 | INFO  | Task 117e6416-2862-4780-9b42-711a6badb3a7 is in state STARTED 2026-04-20 00:55:45.248983 | orchestrator | 2026-04-20 00:55:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:55:48.262485 | orchestrator | 2026-04-20 00:55:48 | INFO  | Task 969e49e8-1788-4df1-b46f-f5763d58cb2c is in state STARTED 
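The PLAY RECAP above is where this run's failure surfaces: `testbed-node-0` reports `failed=1` (the "kolla_toolbox container is missing or not running!" error from the keystone database task), while the other nodes finish clean. As a minimal sketch of how such recap lines can be checked programmatically — `parse_recap` is a hypothetical helper, not part of the OSISM or Zuul tooling — the per-host counters can be pulled out with a regular expression:

```python
import re

# Matches one Ansible PLAY RECAP line, e.g.
#   "testbed-node-0 : ok=18  changed=10  unreachable=0 failed=1  skipped=12 ..."
RECAP_RE = re.compile(
    r"(?P<host>[\w.-]+)\s*:\s*ok=(?P<ok>\d+)\s+changed=(?P<changed>\d+)\s+"
    r"unreachable=(?P<unreachable>\d+)\s+failed=(?P<failed>\d+)"
)


def parse_recap(line: str):
    """Return host name and integer counters for one recap line, or None."""
    m = RECAP_RE.search(line)
    if m is None:
        return None
    d = m.groupdict()
    return {"host": d["host"], **{k: int(v) for k, v in d.items() if k != "host"}}


line = ("testbed-node-0 : ok=18  changed=10  unreachable=0 "
        "failed=1  skipped=12  rescued=0 ignored=0")
result = parse_recap(line)
# result["failed"] == 1 flags the node that hit the kolla_toolbox error
```

A CI post-processing step could apply this to every recap line and fail the job (or open an issue) whenever any host shows a non-zero `failed` or `unreachable` count.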
2026-04-20 00:55:48.262734 | orchestrator | 2026-04-20 00:55:48 | INFO  | Task 9122fa4c-ce3b-430a-9bab-366eba5b1cb8 is in state STARTED
2026-04-20 00:55:48.265277 | orchestrator | 2026-04-20 00:55:48 | INFO  | Task 6588335a-6901-4c6d-b89f-01837568f46b is in state STARTED
2026-04-20 00:55:48.265545 | orchestrator | 2026-04-20 00:55:48 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:55:48.266058 | orchestrator | 2026-04-20 00:55:48 | INFO  | Task 117e6416-2862-4780-9b42-711a6badb3a7 is in state STARTED
2026-04-20 00:55:48.266071 | orchestrator | 2026-04-20 00:55:48 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:55:51.299235 | orchestrator | 2026-04-20 00:55:51 | INFO  | Task 969e49e8-1788-4df1-b46f-f5763d58cb2c is in state STARTED
2026-04-20 00:55:51.301525 | orchestrator | 2026-04-20 00:55:51 | INFO  | Task 9122fa4c-ce3b-430a-9bab-366eba5b1cb8 is in state STARTED
2026-04-20 00:55:51.303794 | orchestrator | 2026-04-20 00:55:51 | INFO  | Task 6588335a-6901-4c6d-b89f-01837568f46b is in state STARTED
2026-04-20 00:55:51.305643 | orchestrator | 2026-04-20 00:55:51 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:55:51.308070 | orchestrator | 2026-04-20 00:55:51 | INFO  | Task 117e6416-2862-4780-9b42-711a6badb3a7 is in state STARTED
2026-04-20 00:55:51.308121 | orchestrator | 2026-04-20 00:55:51 | INFO  | Wait 1 second(s) until the next check
[identical polling cycles repeated every ~3 seconds from 00:55:54 through 00:56:40; all five tasks remained in state STARTED]
2026-04-20 00:56:43.248155 | orchestrator | 2026-04-20 00:56:43 | INFO  | Task bdddc8f1-a89e-4e41-9877-32cf8c18c537 is in state STARTED
2026-04-20 00:56:43.249680 | orchestrator | 2026-04-20 00:56:43 | INFO  | Task 969e49e8-1788-4df1-b46f-f5763d58cb2c is in state SUCCESS
2026-04-20 00:56:43.250439 | orchestrator | 2026-04-20 00:56:43 | INFO  | Task 9122fa4c-ce3b-430a-9bab-366eba5b1cb8 is in state SUCCESS
2026-04-20 00:56:43.251938 | orchestrator | 2026-04-20 00:56:43 | INFO  | Task 6588335a-6901-4c6d-b89f-01837568f46b is in state STARTED
2026-04-20 00:56:43.253393 | orchestrator | 2026-04-20 00:56:43 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:56:43.254957 | orchestrator | 2026-04-20 00:56:43 | INFO  | Task 311c84ec-0b3d-44d1-9a16-6943e16ebba1 is in state STARTED
2026-04-20 00:56:43.256852 | orchestrator | 2026-04-20 00:56:43 | INFO  | Task 117e6416-2862-4780-9b42-711a6badb3a7 is in state STARTED
2026-04-20 00:56:43.256914 | orchestrator | 2026-04-20 00:56:43 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:56:46.287986 | orchestrator | 2026-04-20 00:56:46 | INFO  | Task bdddc8f1-a89e-4e41-9877-32cf8c18c537 is in state STARTED
2026-04-20 00:56:46.288076 | orchestrator | 2026-04-20 00:56:46 | INFO  | Task 6588335a-6901-4c6d-b89f-01837568f46b is in state STARTED
2026-04-20 00:56:46.288875 | orchestrator | 2026-04-20 00:56:46 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:56:46.289457 | orchestrator | 2026-04-20 00:56:46 | INFO  | Task 311c84ec-0b3d-44d1-9a16-6943e16ebba1 is in state STARTED
2026-04-20 00:56:46.290009 | orchestrator | 2026-04-20 00:56:46 | INFO  | Task 117e6416-2862-4780-9b42-711a6badb3a7 is in state SUCCESS
2026-04-20 00:56:46.290511 | orchestrator |
2026-04-20 00:56:46.290542 | orchestrator |
2026-04-20 00:56:46.290552 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-20 00:56:46.290562 | orchestrator |
2026-04-20 00:56:46.290572 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-20 00:56:46.290582 | orchestrator | Monday 20 April 2026 00:55:46 +0000 (0:00:00.256) 0:00:00.256 **********
2026-04-20 00:56:46.290591 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:56:46.290601 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:56:46.290610 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:56:46.290619 | orchestrator |
2026-04-20 00:56:46.290627 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-20 00:56:46.290636 | orchestrator | Monday 20 April 2026 00:55:46 +0000 (0:00:00.244) 0:00:00.500 **********
2026-04-20 00:56:46.290645 | orchestrator | ok: [testbed-node-0] => (item=enable_designate_True)
2026-04-20 00:56:46.290680 | orchestrator | ok: [testbed-node-1] => (item=enable_designate_True)
2026-04-20 00:56:46.290690 | orchestrator | ok: [testbed-node-2] => (item=enable_designate_True)
2026-04-20 00:56:46.290699 | orchestrator |
2026-04-20 00:56:46.290707 | orchestrator | PLAY [Apply role designate] ****************************************************
2026-04-20 00:56:46.290716 | orchestrator |
2026-04-20 00:56:46.290724 | orchestrator | TASK [designate : include_tasks] ***********************************************
2026-04-20 00:56:46.290733 | orchestrator | Monday 20 April 2026 00:55:46 +0000 (0:00:00.243) 0:00:00.743 **********
2026-04-20 00:56:46.290742 | orchestrator | included: /ansible/roles/designate/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:56:46.290752 | orchestrator |
2026-04-20 00:56:46.290761 | orchestrator | TASK [service-ks-register : designate | Creating/deleting services] ************
2026-04-20 00:56:46.290769 | orchestrator | Monday 20 April 2026 00:55:47 +0000 (0:00:00.608) 0:00:01.352 **********
2026-04-20 00:56:46.290778 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating/deleting services (5 retries left).
2026-04-20 00:56:46.290823 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating/deleting services (4 retries left).
2026-04-20 00:56:46.290833 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating/deleting services (3 retries left).
2026-04-20 00:56:46.290842 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating/deleting services (2 retries left).
2026-04-20 00:56:46.290850 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating/deleting services (1 retries left).
2026-04-20 00:56:46.290861 | orchestrator | failed: [testbed-node-0] (item=designate (dns)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Designate DNS Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9001"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9001"}], "name": "designate", "type": "dns"}, "msg": "kolla_toolbox container is missing or not running!"}
2026-04-20 00:56:46.290872 | orchestrator |
2026-04-20 00:56:46.290881 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:56:46.290890 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2026-04-20 00:56:46.290899 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:56:46.290908 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:56:46.290913 | orchestrator |
2026-04-20 00:56:46.290918 | orchestrator |
2026-04-20 00:56:46.290923 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:56:46.290929 | orchestrator | Monday 20 April 2026 00:56:41 +0000 (0:00:53.706) 0:00:55.058 **********
2026-04-20 00:56:46.290934 | orchestrator | ===============================================================================
2026-04-20 00:56:46.290939 | orchestrator | service-ks-register : designate | Creating/deleting services ----------- 53.71s
2026-04-20 00:56:46.290945 | orchestrator | designate : include_tasks ----------------------------------------------- 0.61s
2026-04-20 00:56:46.290950 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.24s
2026-04-20 00:56:46.290955 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.24s
2026-04-20 00:56:46.290961 | orchestrator |
2026-04-20 00:56:46.290966 | orchestrator |
2026-04-20 00:56:46.290971 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-20 00:56:46.290976 | orchestrator |
2026-04-20 00:56:46.290981 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-20 00:56:46.290986 | orchestrator | Monday 20 April 2026 00:55:46 +0000 (0:00:00.231) 0:00:00.231 **********
2026-04-20 00:56:46.290991 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:56:46.291010 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:56:46.291016 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:56:46.291021 | orchestrator |
2026-04-20 00:56:46.291026 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-20 00:56:46.291031 | orchestrator | Monday 20 April 2026 00:55:46 +0000 (0:00:00.262) 0:00:00.494 **********
2026-04-20 00:56:46.291036 | orchestrator | ok: [testbed-node-0] => (item=enable_barbican_True)
2026-04-20 00:56:46.291041 | orchestrator | ok: [testbed-node-1] => (item=enable_barbican_True)
2026-04-20 00:56:46.291046 | orchestrator | ok: [testbed-node-2] => (item=enable_barbican_True)
2026-04-20 00:56:46.291051 | orchestrator |
2026-04-20 00:56:46.291056 | orchestrator | PLAY [Apply role barbican] *****************************************************
2026-04-20 00:56:46.291061 | orchestrator |
2026-04-20 00:56:46.291077 | orchestrator | TASK [barbican : include_tasks] ************************************************
2026-04-20 00:56:46.291083 | orchestrator | Monday 20 April 2026 00:55:47 +0000 (0:00:00.449) 0:00:00.943 **********
2026-04-20 00:56:46.291088 | orchestrator | included: /ansible/roles/barbican/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:56:46.291093 | orchestrator |
2026-04-20 00:56:46.291098 | orchestrator | TASK [service-ks-register : barbican | Creating/deleting services] *************
2026-04-20 00:56:46.291104 | orchestrator | Monday 20 April 2026 00:55:48 +0000 (0:00:00.764) 0:00:01.708 **********
2026-04-20 00:56:46.291110 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating/deleting services (5 retries left).
2026-04-20 00:56:46.291116 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating/deleting services (4 retries left).
2026-04-20 00:56:46.291122 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating/deleting services (3 retries left).
2026-04-20 00:56:46.291128 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating/deleting services (2 retries left).
2026-04-20 00:56:46.291133 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating/deleting services (1 retries left).
2026-04-20 00:56:46.291140 | orchestrator | failed: [testbed-node-0] (item=barbican (key-manager)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Barbican Key Management Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9311"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9311"}], "name": "barbican", "type": "key-manager"}, "msg": "kolla_toolbox container is missing or not running!"}
2026-04-20 00:56:46.291149 | orchestrator |
2026-04-20 00:56:46.291159 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:56:46.291166 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2026-04-20 00:56:46.291172 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:56:46.291178 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:56:46.291184 | orchestrator |
2026-04-20 00:56:46.291190 | orchestrator |
2026-04-20 00:56:46.291196 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:56:46.291202 | orchestrator | Monday 20 April 2026 00:56:41 +0000 (0:00:53.652) 0:00:55.360 **********
2026-04-20 00:56:46.291208 | orchestrator | ===============================================================================
2026-04-20 00:56:46.291214 | orchestrator | service-ks-register : barbican | Creating/deleting services ------------ 53.65s
2026-04-20 00:56:46.291220 | orchestrator | barbican : include_tasks ------------------------------------------------ 0.76s
2026-04-20 00:56:46.291235 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.45s
2026-04-20 00:56:46.291242 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.26s
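The `FAILED - RETRYING` sequences above are Ansible's standard `retries`/`until` loop on the `service-ks-register` task: the module call is attempted repeatedly, a "retries left" notice is printed after each failure, and after the final attempt the item is marked failed with the last error (here `kolla_toolbox container is missing or not running!`). A minimal sketch of that pattern, mirroring the console output rather than Ansible's exact internals, with a hypothetical `register_service` standing in for the real Kolla module call:

```python
def retry_until(task, retries=5):
    """Attempt `task` up to `retries` times; emit a 'retries left'
    notice after each failed attempt, then return the final error.
    Sketch of the retry pattern seen in the console, not Ansible itself."""
    notices = []
    result = None
    for attempt in range(1, retries + 1):
        ok, result = task()
        if ok:
            return True, attempt, notices
        notices.append(f"FAILED - RETRYING ({retries - attempt + 1} retries left).")
    # all attempts exhausted: report the last error message
    return False, retries, notices + [result]

# hypothetical stand-in for the service-ks-register call: always fails,
# like it does in this job because kolla_toolbox is not running
def register_service():
    return False, "kolla_toolbox container is missing or not running!"

ok, attempts, log_lines = retry_until(register_service)
```

Because every attempt fails, `ok` is `False`, `attempts` is 5, and `log_lines` holds five retry notices followed by the final error, matching the shape of the log above.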
2026-04-20 00:56:46.291248 | orchestrator |
2026-04-20 00:56:46.291277 | orchestrator |
2026-04-20 00:56:46.291284 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-20 00:56:46.291290 | orchestrator |
2026-04-20 00:56:46.291296 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-20 00:56:46.291302 | orchestrator | Monday 20 April 2026 00:55:46 +0000 (0:00:00.226) 0:00:00.226 **********
2026-04-20 00:56:46.291308 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:56:46.291314 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:56:46.291320 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:56:46.291326 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:56:46.291331 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:56:46.291337 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:56:46.291343 | orchestrator |
2026-04-20 00:56:46.291349 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-20 00:56:46.291356 | orchestrator | Monday 20 April 2026 00:55:46 +0000 (0:00:00.474) 0:00:00.701 **********
2026-04-20 00:56:46.291364 | orchestrator | ok: [testbed-node-0] => (item=enable_neutron_True)
2026-04-20 00:56:46.291373 | orchestrator | ok: [testbed-node-1] => (item=enable_neutron_True)
2026-04-20 00:56:46.291384 | orchestrator | ok: [testbed-node-2] => (item=enable_neutron_True)
2026-04-20 00:56:46.291396 | orchestrator | ok: [testbed-node-3] => (item=enable_neutron_True)
2026-04-20 00:56:46.291409 | orchestrator | ok: [testbed-node-4] => (item=enable_neutron_True)
2026-04-20 00:56:46.291417 | orchestrator | ok: [testbed-node-5] => (item=enable_neutron_True)
2026-04-20 00:56:46.291424 | orchestrator |
2026-04-20 00:56:46.291433 | orchestrator | PLAY [Apply role neutron] ******************************************************
2026-04-20 00:56:46.291441 | orchestrator |
2026-04-20 00:56:46.291449 | orchestrator | TASK [neutron : include_tasks] *************************************************
2026-04-20 00:56:46.291458 | orchestrator | Monday 20 April 2026 00:55:47 +0000 (0:00:01.100) 0:00:01.801 **********
2026-04-20 00:56:46.291465 | orchestrator | included: /ansible/roles/neutron/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:56:46.291473 | orchestrator |
2026-04-20 00:56:46.291481 | orchestrator | TASK [neutron : Get container facts] *******************************************
2026-04-20 00:56:46.291487 | orchestrator | Monday 20 April 2026 00:55:48 +0000 (0:00:01.002) 0:00:02.804 **********
2026-04-20 00:56:46.291495 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:56:46.291504 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:56:46.291512 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:56:46.291519 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:56:46.291533 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:56:46.291543 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:56:46.291550 | orchestrator |
2026-04-20 00:56:46.291558 | orchestrator | TASK [neutron : Get container volume facts] ************************************
2026-04-20 00:56:46.291567 | orchestrator | Monday 20 April 2026 00:55:50 +0000 (0:00:01.285) 0:00:04.089 **********
2026-04-20 00:56:46.291575 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:56:46.291584 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:56:46.291591 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:56:46.291599 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:56:46.291609 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:56:46.291614 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:56:46.291619 | orchestrator |
2026-04-20 00:56:46.291624 | orchestrator | TASK [neutron : Check for ML2/OVN presence] ************************************
2026-04-20 00:56:46.291630 | orchestrator | Monday 20 April 2026 00:55:51 +0000 (0:00:01.049) 0:00:05.138 **********
2026-04-20 00:56:46.291635 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:56:46.291640 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:56:46.291645 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:56:46.291650 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:56:46.291655 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:56:46.291660 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:56:46.291665 | orchestrator |
2026-04-20 00:56:46.291670 | orchestrator | TASK [neutron : Check for ML2/OVS presence] ************************************
2026-04-20 00:56:46.291681 | orchestrator | Monday 20 April 2026 00:55:51 +0000 (0:00:00.497) 0:00:05.636 **********
2026-04-20 00:56:46.291688 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:56:46.291696 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:56:46.291704 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:56:46.291712 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:56:46.291719 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:56:46.291726 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:56:46.291734 | orchestrator |
2026-04-20 00:56:46.291741 | orchestrator | TASK [service-ks-register : neutron | Creating/deleting services] **************
2026-04-20 00:56:46.291749 | orchestrator | Monday 20 April 2026 00:55:52 +0000 (0:00:00.629) 0:00:06.266 **********
2026-04-20 00:56:46.291763 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating/deleting services (5 retries left).
2026-04-20 00:56:46.291773 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating/deleting services (4 retries left).
2026-04-20 00:56:46.291782 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating/deleting services (3 retries left).
2026-04-20 00:56:46.291790 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating/deleting services (2 retries left).
2026-04-20 00:56:46.291798 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating/deleting services (1 retries left).
2026-04-20 00:56:46.291808 | orchestrator | failed: [testbed-node-0] (item=neutron (network)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Openstack Networking", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9696"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9696"}], "name": "neutron", "type": "network"}, "msg": "kolla_toolbox container is missing or not running!"}
2026-04-20 00:56:46.291817 | orchestrator |
2026-04-20 00:56:46.291822 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:56:46.291827 | orchestrator | testbed-node-0 : ok=5  changed=0 unreachable=0 failed=1  skipped=2  rescued=0 ignored=0
2026-04-20 00:56:46.291833 | orchestrator | testbed-node-1 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-20 00:56:46.291838 | orchestrator | testbed-node-2 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-20 00:56:46.291843 | orchestrator | testbed-node-3 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-20 00:56:46.291848 | orchestrator | testbed-node-4 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-20 00:56:46.291853 | orchestrator | testbed-node-5 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2026-04-20 00:56:46.291858 | orchestrator |
2026-04-20 00:56:46.291863 | orchestrator |
2026-04-20 00:56:46.291868 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:56:46.291873 | orchestrator | Monday 20 April 2026 00:56:45 +0000 (0:00:53.255) 0:00:59.521 **********
2026-04-20 00:56:46.291878 | orchestrator | ===============================================================================
2026-04-20 00:56:46.291883 | orchestrator | service-ks-register : neutron | Creating/deleting services ------------- 53.26s
2026-04-20 00:56:46.291888 | orchestrator | neutron : Get container facts ------------------------------------------- 1.29s
2026-04-20 00:56:46.291893 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.10s
2026-04-20 00:56:46.291898 | orchestrator | neutron : Get container volume facts ------------------------------------ 1.05s
2026-04-20 00:56:46.291903 | orchestrator | neutron : include_tasks ------------------------------------------------- 1.00s
2026-04-20 00:56:46.291913 | orchestrator | neutron : Check for ML2/OVS presence ------------------------------------ 0.63s
2026-04-20 00:56:46.291923 | orchestrator | neutron : Check for ML2/OVN presence ------------------------------------ 0.50s
2026-04-20 00:56:46.291928 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.48s
2026-04-20 00:56:46.291933 | orchestrator | 2026-04-20 00:56:46 | INFO  | Wait 1 second(s) until the next check
2026-04-20 00:56:49.335951 | orchestrator | 2026-04-20 00:56:49 | INFO  | Task bdddc8f1-a89e-4e41-9877-32cf8c18c537 is in state STARTED
2026-04-20 00:56:49.336167 | orchestrator | 2026-04-20 00:56:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 00:56:49.337951 | orchestrator | 2026-04-20 00:56:49 | INFO  | Task 6588335a-6901-4c6d-b89f-01837568f46b is in state STARTED
2026-04-20 00:56:49.339422 | orchestrator | 2026-04-20 00:56:49 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED
2026-04-20 00:56:49.341091 | orchestrator | 2026-04-20 00:56:49 | INFO  | Task 311c84ec-0b3d-44d1-9a16-6943e16ebba1 is in state STARTED
2026-04-20 00:56:49.341137 | orchestrator | 2026-04-20 00:56:49 | INFO  | Wait 1 second(s) until the next check
[identical polling cycles repeated every ~3 seconds from 00:56:52 through 00:57:10; all five tasks remained in state STARTED]
2026-04-20 00:57:13.731694 | orchestrator | 2026-04-20 00:57:13 | INFO  | Task bdddc8f1-a89e-4e41-9877-32cf8c18c537 is in state STARTED
2026-04-20 00:57:13.733375 | orchestrator | 2026-04-20 00:57:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 00:57:13.734438 | orchestrator | 2026-04-20 00:57:13 | INFO  | Task
6588335a-6901-4c6d-b89f-01837568f46b is in state SUCCESS 2026-04-20 00:57:13.735823 | orchestrator | 2026-04-20 00:57:13 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:57:13.737447 | orchestrator | 2026-04-20 00:57:13 | INFO  | Task 311c84ec-0b3d-44d1-9a16-6943e16ebba1 is in state STARTED 2026-04-20 00:57:13.737506 | orchestrator | 2026-04-20 00:57:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:57:16.777136 | orchestrator | 2026-04-20 00:57:16 | INFO  | Task bdddc8f1-a89e-4e41-9877-32cf8c18c537 is in state STARTED 2026-04-20 00:57:16.778409 | orchestrator | 2026-04-20 00:57:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 00:57:16.780655 | orchestrator | 2026-04-20 00:57:16 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:57:16.782673 | orchestrator | 2026-04-20 00:57:16 | INFO  | Task 311c84ec-0b3d-44d1-9a16-6943e16ebba1 is in state STARTED 2026-04-20 00:57:16.782904 | orchestrator | 2026-04-20 00:57:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:57:19.828065 | orchestrator | 2026-04-20 00:57:19 | INFO  | Task bdddc8f1-a89e-4e41-9877-32cf8c18c537 is in state STARTED 2026-04-20 00:57:19.829985 | orchestrator | 2026-04-20 00:57:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 00:57:19.832153 | orchestrator | 2026-04-20 00:57:19 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:57:19.833440 | orchestrator | 2026-04-20 00:57:19 | INFO  | Task 311c84ec-0b3d-44d1-9a16-6943e16ebba1 is in state STARTED 2026-04-20 00:57:19.833622 | orchestrator | 2026-04-20 00:57:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:57:22.876013 | orchestrator | 2026-04-20 00:57:22 | INFO  | Task bdddc8f1-a89e-4e41-9877-32cf8c18c537 is in state STARTED 2026-04-20 00:57:22.877868 | orchestrator | 2026-04-20 00:57:22 | INFO  | Task 
9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 00:57:22.879911 | orchestrator | 2026-04-20 00:57:22 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:57:22.881442 | orchestrator | 2026-04-20 00:57:22 | INFO  | Task 311c84ec-0b3d-44d1-9a16-6943e16ebba1 is in state STARTED 2026-04-20 00:57:22.881845 | orchestrator | 2026-04-20 00:57:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:57:25.920713 | orchestrator | 2026-04-20 00:57:25 | INFO  | Task bdddc8f1-a89e-4e41-9877-32cf8c18c537 is in state STARTED 2026-04-20 00:57:25.921826 | orchestrator | 2026-04-20 00:57:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 00:57:25.923729 | orchestrator | 2026-04-20 00:57:25 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:57:25.927037 | orchestrator | 2026-04-20 00:57:25 | INFO  | Task 311c84ec-0b3d-44d1-9a16-6943e16ebba1 is in state STARTED 2026-04-20 00:57:25.927132 | orchestrator | 2026-04-20 00:57:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:57:28.970490 | orchestrator | 2026-04-20 00:57:28 | INFO  | Task bdddc8f1-a89e-4e41-9877-32cf8c18c537 is in state STARTED 2026-04-20 00:57:28.972893 | orchestrator | 2026-04-20 00:57:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 00:57:28.975392 | orchestrator | 2026-04-20 00:57:28 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:57:28.977018 | orchestrator | 2026-04-20 00:57:28 | INFO  | Task 311c84ec-0b3d-44d1-9a16-6943e16ebba1 is in state STARTED 2026-04-20 00:57:28.977065 | orchestrator | 2026-04-20 00:57:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:57:32.014337 | orchestrator | 2026-04-20 00:57:32 | INFO  | Task bdddc8f1-a89e-4e41-9877-32cf8c18c537 is in state STARTED 2026-04-20 00:57:32.015645 | orchestrator | 2026-04-20 00:57:32 | INFO  | Task 
9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 00:57:32.017269 | orchestrator | 2026-04-20 00:57:32 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:57:32.018584 | orchestrator | 2026-04-20 00:57:32 | INFO  | Task 311c84ec-0b3d-44d1-9a16-6943e16ebba1 is in state STARTED 2026-04-20 00:57:32.018627 | orchestrator | 2026-04-20 00:57:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:57:35.061872 | orchestrator | 2026-04-20 00:57:35 | INFO  | Task bdddc8f1-a89e-4e41-9877-32cf8c18c537 is in state STARTED 2026-04-20 00:57:35.064075 | orchestrator | 2026-04-20 00:57:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 00:57:35.067386 | orchestrator | 2026-04-20 00:57:35 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:57:35.069731 | orchestrator | 2026-04-20 00:57:35 | INFO  | Task 311c84ec-0b3d-44d1-9a16-6943e16ebba1 is in state STARTED 2026-04-20 00:57:35.069822 | orchestrator | 2026-04-20 00:57:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:57:38.121495 | orchestrator | 2026-04-20 00:57:38 | INFO  | Task bdddc8f1-a89e-4e41-9877-32cf8c18c537 is in state STARTED 2026-04-20 00:57:38.123635 | orchestrator | 2026-04-20 00:57:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 00:57:38.126095 | orchestrator | 2026-04-20 00:57:38 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state STARTED 2026-04-20 00:57:38.127913 | orchestrator | 2026-04-20 00:57:38 | INFO  | Task 311c84ec-0b3d-44d1-9a16-6943e16ebba1 is in state STARTED 2026-04-20 00:57:38.128134 | orchestrator | 2026-04-20 00:57:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:57:41.169558 | orchestrator | 2026-04-20 00:57:41 | INFO  | Task bdddc8f1-a89e-4e41-9877-32cf8c18c537 is in state SUCCESS 2026-04-20 00:57:41.171676 | orchestrator | 2026-04-20 00:57:41 | INFO  | Task 
9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 00:57:41.174554 | orchestrator | 2026-04-20 00:57:41 | INFO  | Task 6fb1a72a-3ae9-4a52-a0d7-88749d9cdc04 is in state STARTED 2026-04-20 00:57:41.181052 | orchestrator | 2026-04-20 00:57:41 | INFO  | Task 64dc9230-d969-439f-a4cb-821f20d6a58f is in state SUCCESS 2026-04-20 00:57:41.182107 | orchestrator | 2026-04-20 00:57:41.182164 | orchestrator | 2026-04-20 00:57:41.182174 | orchestrator | PLAY [Download ironic ipa images] ********************************************** 2026-04-20 00:57:41.182183 | orchestrator | 2026-04-20 00:57:41.182190 | orchestrator | TASK [Ensure the destination directory exists] ********************************* 2026-04-20 00:57:41.182197 | orchestrator | Monday 20 April 2026 00:55:46 +0000 (0:00:00.098) 0:00:00.098 ********** 2026-04-20 00:57:41.182201 | orchestrator | changed: [localhost] 2026-04-20 00:57:41.182276 | orchestrator | 2026-04-20 00:57:41.182283 | orchestrator | TASK [Download ironic-agent initramfs] ***************************************** 2026-04-20 00:57:41.182287 | orchestrator | Monday 20 April 2026 00:55:47 +0000 (0:00:00.900) 0:00:00.998 ********** 2026-04-20 00:57:41.182291 | orchestrator | changed: [localhost] 2026-04-20 00:57:41.182295 | orchestrator | 2026-04-20 00:57:41.182300 | orchestrator | TASK [Download ironic-agent kernel] ******************************************** 2026-04-20 00:57:41.182304 | orchestrator | Monday 20 April 2026 00:56:22 +0000 (0:00:35.464) 0:00:36.463 ********** 2026-04-20 00:57:41.182309 | orchestrator | FAILED - RETRYING: [localhost]: Download ironic-agent kernel (3 retries left). 2026-04-20 00:57:41.182314 | orchestrator | FAILED - RETRYING: [localhost]: Download ironic-agent kernel (2 retries left). 
2026-04-20 00:57:41.182317 | orchestrator | changed: [localhost]
2026-04-20 00:57:41.182321 | orchestrator |
2026-04-20 00:57:41.182325 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-20 00:57:41.182329 | orchestrator |
2026-04-20 00:57:41.182332 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-20 00:57:41.182336 | orchestrator | Monday 20 April 2026 00:57:10 +0000 (0:00:47.704) 0:01:24.168 **********
2026-04-20 00:57:41.182361 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.182365 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.182369 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.182373 | orchestrator |
2026-04-20 00:57:41.182377 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-20 00:57:41.182382 | orchestrator | Monday 20 April 2026 00:57:10 +0000 (0:00:00.284) 0:01:24.452 **********
2026-04-20 00:57:41.182389 | orchestrator | ok: [testbed-node-0] => (item=enable_ironic_False)
2026-04-20 00:57:41.182395 | orchestrator | ok: [testbed-node-1] => (item=enable_ironic_False)
2026-04-20 00:57:41.182401 | orchestrator | ok: [testbed-node-2] => (item=enable_ironic_False)
2026-04-20 00:57:41.182407 | orchestrator | [WARNING]: Could not match supplied host pattern, ignoring: enable_ironic_True
2026-04-20 00:57:41.182413 | orchestrator |
2026-04-20 00:57:41.182418 | orchestrator | PLAY [Apply role ironic] *******************************************************
2026-04-20 00:57:41.182423 | orchestrator | skipping: no hosts matched
2026-04-20 00:57:41.182430 | orchestrator |
2026-04-20 00:57:41.182531 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:57:41.182809 | orchestrator | localhost : ok=3  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:57:41.182842 | orchestrator | testbed-node-0 : ok=2  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:57:41.182852 | orchestrator | testbed-node-1 : ok=2  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:57:41.182859 | orchestrator | testbed-node-2 : ok=2  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:57:41.182865 | orchestrator |
2026-04-20 00:57:41.182872 | orchestrator |
2026-04-20 00:57:41.182880 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:57:41.182887 | orchestrator | Monday 20 April 2026 00:57:11 +0000 (0:00:00.389) 0:01:24.842 **********
2026-04-20 00:57:41.182894 | orchestrator | ===============================================================================
2026-04-20 00:57:41.182900 | orchestrator | Download ironic-agent kernel ------------------------------------------- 47.70s
2026-04-20 00:57:41.182908 | orchestrator | Download ironic-agent initramfs ---------------------------------------- 35.46s
2026-04-20 00:57:41.182914 | orchestrator | Ensure the destination directory exists --------------------------------- 0.90s
2026-04-20 00:57:41.182921 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.39s
2026-04-20 00:57:41.182928 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.28s
2026-04-20 00:57:41.182934 | orchestrator |
2026-04-20 00:57:41.182941 | orchestrator |
2026-04-20 00:57:41.182947 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-20 00:57:41.182954 | orchestrator |
2026-04-20 00:57:41.182961 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-20 00:57:41.182967 | orchestrator | Monday 20 April 2026 00:56:44 +0000 (0:00:00.272) 0:00:00.272 **********
2026-04-20 00:57:41.182974 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.182981 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.182988 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.182995 | orchestrator |
2026-04-20 00:57:41.183001 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-20 00:57:41.183008 | orchestrator | Monday 20 April 2026 00:56:44 +0000 (0:00:00.214) 0:00:00.487 **********
2026-04-20 00:57:41.183015 | orchestrator | ok: [testbed-node-0] => (item=enable_placement_True)
2026-04-20 00:57:41.183022 | orchestrator | ok: [testbed-node-1] => (item=enable_placement_True)
2026-04-20 00:57:41.183029 | orchestrator | ok: [testbed-node-2] => (item=enable_placement_True)
2026-04-20 00:57:41.183036 | orchestrator |
2026-04-20 00:57:41.183042 | orchestrator | PLAY [Apply role placement] ****************************************************
2026-04-20 00:57:41.183048 | orchestrator |
2026-04-20 00:57:41.183069 | orchestrator | TASK [placement : include_tasks] ***********************************************
2026-04-20 00:57:41.183076 | orchestrator | Monday 20 April 2026 00:56:44 +0000 (0:00:00.235) 0:00:00.722 **********
2026-04-20 00:57:41.183097 | orchestrator | included: /ansible/roles/placement/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:57:41.183104 | orchestrator |
2026-04-20 00:57:41.183111 | orchestrator | TASK [service-ks-register : placement | Creating/deleting services] ************
2026-04-20 00:57:41.183116 | orchestrator | Monday 20 April 2026 00:56:45 +0000 (0:00:00.631) 0:00:01.354 **********
2026-04-20 00:57:41.183123 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating/deleting services (5 retries left).
2026-04-20 00:57:41.183129 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating/deleting services (4 retries left).
2026-04-20 00:57:41.183136 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating/deleting services (3 retries left).
2026-04-20 00:57:41.183143 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating/deleting services (2 retries left).
2026-04-20 00:57:41.183149 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating/deleting services (1 retries left).
2026-04-20 00:57:41.183157 | orchestrator | failed: [testbed-node-0] (item=placement (placement)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Placement Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:8780"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:8780"}], "name": "placement", "type": "placement"}, "msg": "kolla_toolbox container is missing or not running!"}
2026-04-20 00:57:41.183166 | orchestrator |
2026-04-20 00:57:41.183172 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:57:41.183178 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2026-04-20 00:57:41.183184 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:57:41.183190 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2026-04-20 00:57:41.183195 | orchestrator |
2026-04-20 00:57:41.183201 | orchestrator |
2026-04-20 00:57:41.183226 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:57:41.183234 | orchestrator | Monday 20 April 2026 00:57:39 +0000 (0:00:53.632) 0:00:54.986 **********
2026-04-20 00:57:41.183241 | orchestrator | ===============================================================================
2026-04-20 00:57:41.183247 | orchestrator | service-ks-register : placement | Creating/deleting services ----------- 53.63s
2026-04-20 00:57:41.183252 | orchestrator | placement : include_tasks ----------------------------------------------- 0.63s
2026-04-20 00:57:41.183264 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.24s
2026-04-20 00:57:41.183271 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.21s
2026-04-20 00:57:41.183601 | orchestrator |
2026-04-20 00:57:41.183623 | orchestrator | [WARNING]: Collection community.general does not support Ansible version
2026-04-20 00:57:41.183631 | orchestrator | 2.16.14
2026-04-20 00:57:41.183638 | orchestrator |
2026-04-20 00:57:41.183645 | orchestrator | PLAY [Prepare deployment of Ceph services] *************************************
2026-04-20 00:57:41.183651 | orchestrator |
2026-04-20 00:57:41.183657 | orchestrator | TASK [ceph-facts : Include facts.yml] ******************************************
2026-04-20 00:57:41.183665 | orchestrator | Monday 20 April 2026 00:47:04 +0000 (0:00:00.899) 0:00:00.900 **********
2026-04-20 00:57:41.183672 | orchestrator | included: /ansible/roles/ceph-facts/tasks/facts.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:57:41.183678 | orchestrator |
2026-04-20 00:57:41.183685 | orchestrator | TASK [ceph-facts : Check if it is atomic host] *********************************
2026-04-20 00:57:41.183702 | orchestrator | Monday 20 April 2026 00:47:05 +0000 (0:00:01.112) 0:00:02.012 **********
2026-04-20 00:57:41.183709 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.183716 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.183723 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.183728 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.183735 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.183741 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.183748 | orchestrator |
2026-04-20 00:57:41.183754 | orchestrator | TASK [ceph-facts : Set_fact is_atomic] *****************************************
2026-04-20 00:57:41.183761 | orchestrator | Monday 20 April 2026 00:47:07 +0000 (0:00:01.875) 0:00:03.888 **********
2026-04-20 00:57:41.183767 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.183773 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.183780 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.183786 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.183792 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.183799 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.183806 | orchestrator |
2026-04-20 00:57:41.183813 | orchestrator | TASK [ceph-facts : Check if podman binary is present] **************************
2026-04-20 00:57:41.183819 | orchestrator | Monday 20 April 2026 00:47:08 +0000 (0:00:00.723) 0:00:04.612 **********
2026-04-20 00:57:41.183826 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.183832 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.183839 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.183846 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.183852 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.183858 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.183864 | orchestrator |
2026-04-20 00:57:41.183870 | orchestrator | TASK [ceph-facts : Set_fact container_binary] **********************************
2026-04-20 00:57:41.183877 | orchestrator | Monday 20 April 2026 00:47:09 +0000 (0:00:00.799) 0:00:05.411 **********
2026-04-20 00:57:41.183883 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.183889 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.183896 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.183902 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.183921 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.183929 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.183936 | orchestrator |
2026-04-20 00:57:41.183943 | orchestrator | TASK [ceph-facts : Set_fact ceph_cmd] ******************************************
2026-04-20 00:57:41.183950 | orchestrator | Monday 20 April 2026 00:47:09 +0000 (0:00:00.819) 0:00:06.231 **********
2026-04-20 00:57:41.183956 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.183963 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.183969 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.183976 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.183982 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.183989 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.183995 | orchestrator |
2026-04-20 00:57:41.184002 | orchestrator | TASK [ceph-facts : Set_fact discovered_interpreter_python] *********************
2026-04-20 00:57:41.184009 | orchestrator | Monday 20 April 2026 00:47:10 +0000 (0:00:00.743) 0:00:06.974 **********
2026-04-20 00:57:41.184015 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.184022 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.184028 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.184035 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.184053 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.184067 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.184075 | orchestrator |
2026-04-20 00:57:41.184081 | orchestrator | TASK [ceph-facts : Set_fact discovered_interpreter_python if not previously set] ***
2026-04-20 00:57:41.184088 | orchestrator | Monday 20 April 2026 00:47:11 +0000 (0:00:01.181) 0:00:08.156 **********
2026-04-20 00:57:41.184094 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.184100 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.184107 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.184113 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.184129 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.184136 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.184142 | orchestrator |
2026-04-20 00:57:41.184149 | orchestrator | TASK [ceph-facts : Set_fact ceph_release ceph_stable_release] ******************
2026-04-20 00:57:41.184155 | orchestrator | Monday 20 April 2026 00:47:12 +0000 (0:00:01.005) 0:00:09.161 **********
2026-04-20 00:57:41.184161 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.184168 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.184174 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.184180 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.184186 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.184192 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.184199 | orchestrator |
2026-04-20 00:57:41.184204 | orchestrator | TASK [ceph-facts : Set_fact monitor_name ansible_facts['hostname']] ************
2026-04-20 00:57:41.184235 | orchestrator | Monday 20 April 2026 00:47:13 +0000 (0:00:00.868) 0:00:10.030 **********
2026-04-20 00:57:41.184243 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
2026-04-20 00:57:41.184249 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2026-04-20 00:57:41.184257 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2026-04-20 00:57:41.184263 | orchestrator |
2026-04-20 00:57:41.184269 | orchestrator | TASK [ceph-facts : Set_fact container_exec_cmd] ********************************
2026-04-20 00:57:41.184692 | orchestrator | Monday 20 April 2026 00:47:14 +0000 (0:00:00.583) 0:00:10.614 **********
2026-04-20 00:57:41.184713 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.184720 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.184726 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.184733 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.184738 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.184744 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.184750 | orchestrator |
2026-04-20 00:57:41.184756 | orchestrator | TASK [ceph-facts : Find a running mon container] *******************************
2026-04-20 00:57:41.184762 | orchestrator | Monday 20 April 2026 00:47:15 +0000 (0:00:00.875) 0:00:11.490 **********
2026-04-20 00:57:41.184768 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
2026-04-20 00:57:41.184774 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2026-04-20 00:57:41.184781 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2026-04-20 00:57:41.184787 | orchestrator |
2026-04-20 00:57:41.184793 | orchestrator | TASK [ceph-facts : Check for a ceph mon socket] ********************************
2026-04-20 00:57:41.184801 | orchestrator | Monday 20 April 2026 00:47:17 +0000 (0:00:02.308) 0:00:13.798 **********
2026-04-20 00:57:41.184807 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)
2026-04-20 00:57:41.184814 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)
2026-04-20 00:57:41.184821 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)
2026-04-20 00:57:41.184827 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.184834 | orchestrator |
2026-04-20 00:57:41.184839 | orchestrator | TASK [ceph-facts : Check if the ceph mon socket is in-use] *********************
2026-04-20 00:57:41.184845 | orchestrator | Monday 20 April 2026 00:47:17 +0000 (0:00:00.300) 0:00:14.099 **********
2026-04-20 00:57:41.184854 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'})
2026-04-20 00:57:41.184865 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'})
2026-04-20 00:57:41.184871 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})
2026-04-20 00:57:41.184887 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.184894 | orchestrator |
2026-04-20 00:57:41.184929 | orchestrator | TASK [ceph-facts : Set_fact running_mon - non_container] ***********************
2026-04-20 00:57:41.184937 | orchestrator | Monday 20 April 2026 00:47:18 +0000 (0:00:00.663) 0:00:14.763 **********
2026-04-20 00:57:41.184946 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
2026-04-20 00:57:41.184956 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
2026-04-20 00:57:41.184963 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
2026-04-20 00:57:41.184969 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.184976 | orchestrator |
2026-04-20 00:57:41.184982 | orchestrator | TASK [ceph-facts : Set_fact running_mon - container] ***************************
2026-04-20 00:57:41.184988 | orchestrator | Monday 20 April 2026 00:47:18 +0000 (0:00:00.283) 0:00:15.046 **********
2026-04-20 00:57:41.185002 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'stdout': '', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-0'], 'start': '2026-04-20 00:47:15.845186', 'end': '2026-04-20 00:47:15.923702', 'delta': '0:00:00.078516', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-0', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-0', 'ansible_loop_var': 'item'})
2026-04-20 00:57:41.185013 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'stdout': '', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-1'], 'start': '2026-04-20 00:47:16.527143', 'end': '2026-04-20 00:47:16.589579', 'delta': '0:00:00.062436', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-1', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-1', 'ansible_loop_var': 'item'})
2026-04-20 00:57:41.185020 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'stdout': '', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-2'], 'start': '2026-04-20 00:47:17.249060', 'end': '2026-04-20 00:47:17.314064', 'delta': '0:00:00.065004', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-2', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})
2026-04-20 00:57:41.185033 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.185038 | orchestrator |
2026-04-20 00:57:41.185045 | orchestrator | TASK [ceph-facts : Set_fact _container_exec_cmd] *******************************
2026-04-20 00:57:41.185051 | orchestrator | Monday 20 April 2026 00:47:18 +0000 (0:00:00.156) 0:00:15.203 **********
2026-04-20 00:57:41.185057 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.185063 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.185087 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.185095 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.185101 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.185106 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.185112 | orchestrator |
2026-04-20 00:57:41.185118 | orchestrator | TASK [ceph-facts : Get current fsid if cluster is already running] *************
2026-04-20 00:57:41.185124 | orchestrator | Monday 20 April 2026 00:47:21 +0000 (0:00:03.041) 0:00:18.244 **********
2026-04-20 00:57:41.185130 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)]
2026-04-20 00:57:41.185137 | orchestrator |
2026-04-20 00:57:41.185144 | orchestrator | TASK [ceph-facts : Set_fact current_fsid rc 1] *********************************
2026-04-20 00:57:41.185150 | orchestrator | Monday 20 April 2026 00:47:22 +0000 (0:00:00.758) 0:00:19.002 **********
2026-04-20 00:57:41.185156 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.185162 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.185168 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.185174 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.185180 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.185186 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.185192 | orchestrator |
2026-04-20 00:57:41.185197 | orchestrator | TASK [ceph-facts : Get current fsid] *******************************************
2026-04-20 00:57:41.185204 | orchestrator | Monday 20 April 2026 00:47:24 +0000 (0:00:01.823) 0:00:20.826 **********
2026-04-20 00:57:41.185261 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.185269 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.185275 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.185282 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.185546 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.185557 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.185564 | orchestrator |
2026-04-20 00:57:41.185571 | orchestrator | TASK [ceph-facts : Set_fact fsid] **********************************************
2026-04-20 00:57:41.185577 | orchestrator | Monday 20 April 2026 00:47:26 +0000 (0:00:01.793) 0:00:22.620 **********
2026-04-20 00:57:41.185584 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.185591 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.185597 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.185603 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.185609 |
orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.185615 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.185621 | orchestrator | 2026-04-20 00:57:41.185627 | orchestrator | TASK [ceph-facts : Set_fact fsid from current_fsid] **************************** 2026-04-20 00:57:41.185634 | orchestrator | Monday 20 April 2026 00:47:27 +0000 (0:00:00.885) 0:00:23.505 ********** 2026-04-20 00:57:41.185640 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.185647 | orchestrator | 2026-04-20 00:57:41.185653 | orchestrator | TASK [ceph-facts : Generate cluster fsid] ************************************** 2026-04-20 00:57:41.185659 | orchestrator | Monday 20 April 2026 00:47:27 +0000 (0:00:00.640) 0:00:24.145 ********** 2026-04-20 00:57:41.185666 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.185673 | orchestrator | 2026-04-20 00:57:41.185679 | orchestrator | TASK [ceph-facts : Set_fact fsid] ********************************************** 2026-04-20 00:57:41.185696 | orchestrator | Monday 20 April 2026 00:47:28 +0000 (0:00:00.312) 0:00:24.457 ********** 2026-04-20 00:57:41.185701 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.185713 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.185719 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.185725 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.185731 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.185736 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.185743 | orchestrator | 2026-04-20 00:57:41.185749 | orchestrator | TASK [ceph-facts : Resolve device link(s)] ************************************* 2026-04-20 00:57:41.185755 | orchestrator | Monday 20 April 2026 00:47:29 +0000 (0:00:01.335) 0:00:25.793 ********** 2026-04-20 00:57:41.185760 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.185767 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.185772 | 
orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.185778 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.185784 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.185791 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.185797 | orchestrator | 2026-04-20 00:57:41.185804 | orchestrator | TASK [ceph-facts : Set_fact build devices from resolved symlinks] ************** 2026-04-20 00:57:41.185810 | orchestrator | Monday 20 April 2026 00:47:30 +0000 (0:00:00.983) 0:00:26.777 ********** 2026-04-20 00:57:41.185817 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.185823 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.185829 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.185835 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.185841 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.185847 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.185854 | orchestrator | 2026-04-20 00:57:41.185860 | orchestrator | TASK [ceph-facts : Resolve dedicated_device link(s)] *************************** 2026-04-20 00:57:41.185866 | orchestrator | Monday 20 April 2026 00:47:31 +0000 (0:00:01.093) 0:00:27.870 ********** 2026-04-20 00:57:41.185968 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.185977 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.185983 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.185990 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.185995 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.186001 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.186006 | orchestrator | 2026-04-20 00:57:41.186335 | orchestrator | TASK [ceph-facts : Set_fact build dedicated_devices from resolved symlinks] **** 2026-04-20 00:57:41.186360 | orchestrator | Monday 20 April 2026 00:47:32 +0000 (0:00:01.131) 0:00:29.002 ********** 2026-04-20 00:57:41.186368 | 
orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.186374 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.186380 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.186387 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.186394 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.186400 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.186407 | orchestrator | 2026-04-20 00:57:41.186414 | orchestrator | TASK [ceph-facts : Resolve bluestore_wal_device link(s)] *********************** 2026-04-20 00:57:41.186421 | orchestrator | Monday 20 April 2026 00:47:33 +0000 (0:00:00.885) 0:00:29.887 ********** 2026-04-20 00:57:41.186427 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.186460 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.186468 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.186475 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.186481 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.186487 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.186493 | orchestrator | 2026-04-20 00:57:41.186499 | orchestrator | TASK [ceph-facts : Set_fact build bluestore_wal_devices from resolved symlinks] *** 2026-04-20 00:57:41.186506 | orchestrator | Monday 20 April 2026 00:47:34 +0000 (0:00:00.812) 0:00:30.700 ********** 2026-04-20 00:57:41.186512 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.186529 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.186534 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.186540 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.186545 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.186551 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.186556 | orchestrator | 2026-04-20 00:57:41.186562 | orchestrator | TASK [ceph-facts : Collect existed devices] ************************************ 
2026-04-20 00:57:41.186568 | orchestrator | Monday 20 April 2026 00:47:34 +0000 (0:00:00.533) 0:00:31.234 ********** 2026-04-20 00:57:41.186578 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--4264b90b--a777--529d--80cd--078215cd7b61-osd--block--4264b90b--a777--529d--80cd--078215cd7b61', 'dm-uuid-LVM-IfSfGszKHiaKTTNI02Uf8MUQQ4OjiUWfiT4sKZQwvBWbStfgT02J1cBvwS0hJ5Ri'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.186588 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--0c7195b4--6e55--5dce--81dc--250aafa1626c-osd--block--0c7195b4--6e55--5dce--81dc--250aafa1626c', 'dm-uuid-LVM-wqhPAmbCBa1EDLvJgbIeCVRnb7e8vUllW3dSX5UPeygNS001DcPoOZ2IxMimNea1'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.186609 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.186620 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop1', 'value': {'holders': [], 'host': 
'', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.186626 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.186632 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--7b8b741f--ff85--57a0--9457--c04aa474e6a9-osd--block--7b8b741f--ff85--57a0--9457--c04aa474e6a9', 'dm-uuid-LVM-lw9HUO9cNePWfT2Pexxh0s2cnlz7QYqvJilk0DjsdoTcFqGmlXOynjtyLIMbLNxQ'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.186930 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.186960 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-1', 'value': 
{'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--a3c07e85--95b7--5759--bf4d--00aad97d3561-osd--block--a3c07e85--95b7--5759--bf4d--00aad97d3561', 'dm-uuid-LVM-t2zFV9phmXPkHxXhobyelLxvV4hrZYVXxR4ps4MaDQZgEsoKDmVcziC3DbY6S7qJ'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.186968 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.186975 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.186982 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.186994 | orchestrator | skipping: [testbed-node-4] => (item={'key': 
'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187001 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187008 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187014 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187107 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': 
'0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187122 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187129 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187135 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187141 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 
'virtual': 1}})  2026-04-20 00:57:41.187157 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part1', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part14', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part15', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part16', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 
'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.187438 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part1', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part14', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part15', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part16', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 
'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.187471 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'holders': ['ceph--4264b90b--a777--529d--80cd--078215cd7b61-osd--block--4264b90b--a777--529d--80cd--078215cd7b61'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-zrnUPj-E0xj-u6GZ-IZ7t-BSHz-exTY-3U5YEc', 'scsi-0QEMU_QEMU_HARDDISK_71e5e2fe-8079-44a9-83c9-718c1a37ec11', 'scsi-SQEMU_QEMU_HARDDISK_71e5e2fe-8079-44a9-83c9-718c1a37ec11'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.187482 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'holders': ['ceph--0c7195b4--6e55--5dce--81dc--250aafa1626c-osd--block--0c7195b4--6e55--5dce--81dc--250aafa1626c'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-zv4eBP-1KQu-zoc8-4Ks7-3EPc-TEoE-YDwG49', 'scsi-0QEMU_QEMU_HARDDISK_0c844390-ddcc-47db-87c2-e0ad3f299f11', 'scsi-SQEMU_QEMU_HARDDISK_0c844390-ddcc-47db-87c2-e0ad3f299f11'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.187489 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4d9b431e-9b52-486b-bddb-3e9e0ee5fa39', 'scsi-SQEMU_QEMU_HARDDISK_4d9b431e-9b52-486b-bddb-3e9e0ee5fa39'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.187571 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'holders': ['ceph--7b8b741f--ff85--57a0--9457--c04aa474e6a9-osd--block--7b8b741f--ff85--57a0--9457--c04aa474e6a9'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-JsgzGM-E3nq-gCWf-7fT4-ajsm-VhbE-E5ih4T', 'scsi-0QEMU_QEMU_HARDDISK_6f84c887-ba73-482f-a41f-d5b1a59c2e3c', 'scsi-SQEMU_QEMU_HARDDISK_6f84c887-ba73-482f-a41f-d5b1a59c2e3c'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.187584 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-20-00-03-47-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.187591 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'holders': ['ceph--a3c07e85--95b7--5759--bf4d--00aad97d3561-osd--block--a3c07e85--95b7--5759--bf4d--00aad97d3561'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-PxO9K9-NxBp-CF9P-CZ2U-Mr2F-2HvG-ZOs2Yf', 'scsi-0QEMU_QEMU_HARDDISK_9b7f1cab-7403-4991-80fd-9e18e6faf85e', 'scsi-SQEMU_QEMU_HARDDISK_9b7f1cab-7403-4991-80fd-9e18e6faf85e'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.187602 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0604a395-fc8c-4060-a9f6-9fb568501435', 'scsi-SQEMU_QEMU_HARDDISK_0604a395-fc8c-4060-a9f6-9fb568501435'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.187609 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--f2b53557--bc93--5e7c--9922--524bc90e2f58-osd--block--f2b53557--bc93--5e7c--9922--524bc90e2f58', 'dm-uuid-LVM-oiEOoD5dVCLksAjcYkcQxq07ayCngSR6v2bKe1AtEK1XlhE9dhfgguA3x9voHqOX'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187623 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': 
['dm-name-ceph--575cdf11--a3b3--50b3--a6b0--c04d40287ec6-osd--block--575cdf11--a3b3--50b3--a6b0--c04d40287ec6', 'dm-uuid-LVM-X9Z1iQEwwD1G0QlSKwYXR1ueod4K8eqpghyiucE34SecLfgjufAMbWW75vBvaWlf'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187702 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-20-00-03-26-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.187713 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187720 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 
'virtual': 1}})  2026-04-20 00:57:41.187727 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187733 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187764 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187771 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187778 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 
'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187792 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187840 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part1', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part14', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': 
['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part15', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part16', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.187847 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187852 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'holders': ['ceph--f2b53557--bc93--5e7c--9922--524bc90e2f58-osd--block--f2b53557--bc93--5e7c--9922--524bc90e2f58'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-yvipd2-ylGY-cevr-TOS1-fWSQ-K3IX-2V7x97', 'scsi-0QEMU_QEMU_HARDDISK_bdcbd50e-fc40-4173-bc88-351fd741a560', 'scsi-SQEMU_QEMU_HARDDISK_bdcbd50e-fc40-4173-bc88-351fd741a560'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.187866 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187924 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'holders': ['ceph--575cdf11--a3b3--50b3--a6b0--c04d40287ec6-osd--block--575cdf11--a3b3--50b3--a6b0--c04d40287ec6'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-J8WRl9-vfy9-xFuV-yNo1-3fdp-WX3V-1XW9PF', 'scsi-0QEMU_QEMU_HARDDISK_bb585aa1-11e8-43ef-a761-9431875b84d1', 'scsi-SQEMU_QEMU_HARDDISK_bb585aa1-11e8-43ef-a761-9431875b84d1'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.187932 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.187937 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_6895d0f2-ba69-41e1-a4cc-d0f527389fe4', 'scsi-SQEMU_QEMU_HARDDISK_6895d0f2-ba69-41e1-a4cc-d0f527389fe4'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.187941 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188009 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188022 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-20-00-03-37-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}) 
 2026-04-20 00:57:41.188029 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.188036 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188054 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188061 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188395 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef', 'scsi-SQEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef-part1', 'scsi-SQEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef-part14', 'scsi-SQEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef-part15', 'scsi-SQEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef-part16', 'scsi-SQEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.188432 | 
orchestrator | skipping: [testbed-node-0] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-20-00-03-28-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.188440 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188458 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.188465 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188471 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.188477 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': 
'512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188481 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.188520 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188525 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188530 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188533 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 
00:57:41.188537 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188545 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188555 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188749 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a', 'scsi-SQEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a-part1', 'scsi-SQEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a-part14', 'scsi-SQEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a-part15', 'scsi-SQEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a-part16', 'scsi-SQEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.188766 | 
orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188770 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-20-00-03-31-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.188774 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.188783 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188794 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 
'virtual': 1}})  2026-04-20 00:57:41.188798 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188801 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188805 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2026-04-20 00:57:41.188854 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e', 'scsi-SQEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e-part1', 'scsi-SQEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e-part14', 'scsi-SQEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e-part15', 'scsi-SQEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e-part16', 'scsi-SQEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.188868 | 
orchestrator | skipping: [testbed-node-2] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-20-00-03-33-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:57:41.188880 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.188888 | orchestrator | 2026-04-20 00:57:41.188894 | orchestrator | TASK [ceph-facts : Set_fact devices generate device list when osd_auto_discovery] *** 2026-04-20 00:57:41.188901 | orchestrator | Monday 20 April 2026 00:47:36 +0000 (0:00:01.528) 0:00:32.762 ********** 2026-04-20 00:57:41.188909 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--4264b90b--a777--529d--80cd--078215cd7b61-osd--block--4264b90b--a777--529d--80cd--078215cd7b61', 'dm-uuid-LVM-IfSfGszKHiaKTTNI02Uf8MUQQ4OjiUWfiT4sKZQwvBWbStfgT02J1cBvwS0hJ5Ri'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.188955 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 
'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--0c7195b4--6e55--5dce--81dc--250aafa1626c-osd--block--0c7195b4--6e55--5dce--81dc--250aafa1626c', 'dm-uuid-LVM-wqhPAmbCBa1EDLvJgbIeCVRnb7e8vUllW3dSX5UPeygNS001DcPoOZ2IxMimNea1'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.188964 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.188972 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.188978 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': 
True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.188994 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189001 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189008 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 
'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189051 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189075 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189086 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 
'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part1', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part14', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part15', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part16', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': 
'4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189126 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--4264b90b--a777--529d--80cd--078215cd7b61-osd--block--4264b90b--a777--529d--80cd--078215cd7b61'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-zrnUPj-E0xj-u6GZ-IZ7t-BSHz-exTY-3U5YEc', 'scsi-0QEMU_QEMU_HARDDISK_71e5e2fe-8079-44a9-83c9-718c1a37ec11', 'scsi-SQEMU_QEMU_HARDDISK_71e5e2fe-8079-44a9-83c9-718c1a37ec11'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189134 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--0c7195b4--6e55--5dce--81dc--250aafa1626c-osd--block--0c7195b4--6e55--5dce--81dc--250aafa1626c'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-zv4eBP-1KQu-zoc8-4Ks7-3EPc-TEoE-YDwG49', 'scsi-0QEMU_QEMU_HARDDISK_0c844390-ddcc-47db-87c2-e0ad3f299f11', 'scsi-SQEMU_QEMU_HARDDISK_0c844390-ddcc-47db-87c2-e0ad3f299f11'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189138 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4d9b431e-9b52-486b-bddb-3e9e0ee5fa39', 'scsi-SQEMU_QEMU_HARDDISK_4d9b431e-9b52-486b-bddb-3e9e0ee5fa39'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189151 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-20-00-03-47-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 
'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189156 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--7b8b741f--ff85--57a0--9457--c04aa474e6a9-osd--block--7b8b741f--ff85--57a0--9457--c04aa474e6a9', 'dm-uuid-LVM-lw9HUO9cNePWfT2Pexxh0s2cnlz7QYqvJilk0DjsdoTcFqGmlXOynjtyLIMbLNxQ'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189193 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--a3c07e85--95b7--5759--bf4d--00aad97d3561-osd--block--a3c07e85--95b7--5759--bf4d--00aad97d3561', 'dm-uuid-LVM-t2zFV9phmXPkHxXhobyelLxvV4hrZYVXxR4ps4MaDQZgEsoKDmVcziC3DbY6S7qJ'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189451 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 
'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189468 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189491 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189503 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 
'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189510 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189516 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189637 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 
'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189649 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189656 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.189670 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part1', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part14', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part15', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part16', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  
2026-04-20 00:57:41.189719 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--7b8b741f--ff85--57a0--9457--c04aa474e6a9-osd--block--7b8b741f--ff85--57a0--9457--c04aa474e6a9'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-JsgzGM-E3nq-gCWf-7fT4-ajsm-VhbE-E5ih4T', 'scsi-0QEMU_QEMU_HARDDISK_6f84c887-ba73-482f-a41f-d5b1a59c2e3c', 'scsi-SQEMU_QEMU_HARDDISK_6f84c887-ba73-482f-a41f-d5b1a59c2e3c'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189729 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--a3c07e85--95b7--5759--bf4d--00aad97d3561-osd--block--a3c07e85--95b7--5759--bf4d--00aad97d3561'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-PxO9K9-NxBp-CF9P-CZ2U-Mr2F-2HvG-ZOs2Yf', 'scsi-0QEMU_QEMU_HARDDISK_9b7f1cab-7403-4991-80fd-9e18e6faf85e', 'scsi-SQEMU_QEMU_HARDDISK_9b7f1cab-7403-4991-80fd-9e18e6faf85e'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189744 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0604a395-fc8c-4060-a9f6-9fb568501435', 'scsi-SQEMU_QEMU_HARDDISK_0604a395-fc8c-4060-a9f6-9fb568501435'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189755 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-20-00-03-26-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 
'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189761 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--f2b53557--bc93--5e7c--9922--524bc90e2f58-osd--block--f2b53557--bc93--5e7c--9922--524bc90e2f58', 'dm-uuid-LVM-oiEOoD5dVCLksAjcYkcQxq07ayCngSR6v2bKe1AtEK1XlhE9dhfgguA3x9voHqOX'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189768 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.189806 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--575cdf11--a3b3--50b3--a6b0--c04d40287ec6-osd--block--575cdf11--a3b3--50b3--a6b0--c04d40287ec6', 'dm-uuid-LVM-X9Z1iQEwwD1G0QlSKwYXR1ueod4K8eqpghyiucE34SecLfgjufAMbWW75vBvaWlf'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189815 | orchestrator | skipping: 
[testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189821 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189835 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189845 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 
'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189853 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189859 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189899 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 
'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189907 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189920 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189929 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) 
| bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189936 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189943 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.189950 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop7', 'value': {'holders': 
[], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190301 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef', 'scsi-SQEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef-part1', 'scsi-SQEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef-part14', 'scsi-SQEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef-part15', 'scsi-SQEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': 
'10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef-part16', 'scsi-SQEMU_QEMU_HARDDISK_0ddb617a-526c-421f-a511-7cc1055ebfef-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190361 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190372 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-20-00-03-28-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 
'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190445 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190453 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190466 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 
'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190470 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190513 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part1', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part14', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part15', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part16', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  
2026-04-20 00:57:41.190523 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190536 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190543 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.190559 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--f2b53557--bc93--5e7c--9922--524bc90e2f58-osd--block--f2b53557--bc93--5e7c--9922--524bc90e2f58'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-yvipd2-ylGY-cevr-TOS1-fWSQ-K3IX-2V7x97', 'scsi-0QEMU_QEMU_HARDDISK_bdcbd50e-fc40-4173-bc88-351fd741a560', 'scsi-SQEMU_QEMU_HARDDISK_bdcbd50e-fc40-4173-bc88-351fd741a560'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190567 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190574 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190624 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 
'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190638 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--575cdf11--a3b3--50b3--a6b0--c04d40287ec6-osd--block--575cdf11--a3b3--50b3--a6b0--c04d40287ec6'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-J8WRl9-vfy9-xFuV-yNo1-3fdp-WX3V-1XW9PF', 'scsi-0QEMU_QEMU_HARDDISK_bb585aa1-11e8-43ef-a761-9431875b84d1', 'scsi-SQEMU_QEMU_HARDDISK_bb585aa1-11e8-43ef-a761-9431875b84d1'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190645 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 
'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190655 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190662 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_6895d0f2-ba69-41e1-a4cc-d0f527389fe4', 'scsi-SQEMU_QEMU_HARDDISK_6895d0f2-ba69-41e1-a4cc-d0f527389fe4'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190669 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190713 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190730 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 
'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a', 'scsi-SQEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a-part1', 'scsi-SQEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a-part14', 'scsi-SQEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a-part15', 'scsi-SQEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a-part16', 'scsi-SQEMU_QEMU_HARDDISK_94a87711-1bba-4ac5-aa91-62925126bc5a-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 
'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190737 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-20-00-03-31-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190744 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.190763 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190811 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 
82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-20-00-03-37-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190819 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190826 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190832 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.190842 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was 
False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190848 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190881 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e', 'scsi-SQEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e-part1', 'scsi-SQEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e-part14', 'scsi-SQEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e-part15', 'scsi-SQEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e-part16', 'scsi-SQEMU_QEMU_HARDDISK_abc91533-fafb-4291-911d-be538a80553e-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  
2026-04-20 00:57:41.190895 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-20-00-03-33-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:57:41.190907 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.190914 | orchestrator | 2026-04-20 00:57:41.190920 | orchestrator | TASK [ceph-facts : Check if the ceph conf exists] ****************************** 2026-04-20 00:57:41.190927 | orchestrator | Monday 20 April 2026 00:47:37 +0000 (0:00:01.380) 0:00:34.143 ********** 2026-04-20 00:57:41.190934 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.190953 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.190959 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.190966 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.190973 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.190979 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.190986 | orchestrator | 2026-04-20 00:57:41.190993 | orchestrator | TASK [ceph-facts : Set default osd_pool_default_crush_rule fact] *************** 2026-04-20 00:57:41.191000 | orchestrator | Monday 20 April 2026 00:47:39 +0000 (0:00:01.345) 0:00:35.489 ********** 2026-04-20 00:57:41.191006 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.191012 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.191019 | 
orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.191026 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.191033 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.191040 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.191046 | orchestrator | 2026-04-20 00:57:41.191053 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] *************************** 2026-04-20 00:57:41.191066 | orchestrator | Monday 20 April 2026 00:47:39 +0000 (0:00:00.543) 0:00:36.032 ********** 2026-04-20 00:57:41.191073 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.191080 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.191093 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.191100 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.191106 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.191112 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.191119 | orchestrator | 2026-04-20 00:57:41.191125 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] *********************** 2026-04-20 00:57:41.191132 | orchestrator | Monday 20 April 2026 00:47:40 +0000 (0:00:00.731) 0:00:36.763 ********** 2026-04-20 00:57:41.191136 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.191140 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.191144 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.191148 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.191151 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.191155 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.191159 | orchestrator | 2026-04-20 00:57:41.191162 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] *************************** 2026-04-20 00:57:41.191166 | orchestrator | Monday 20 April 2026 00:47:41 +0000 (0:00:00.941) 0:00:37.705 ********** 2026-04-20 00:57:41.191170 | orchestrator | skipping: 
[testbed-node-3] 2026-04-20 00:57:41.191174 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.191178 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.191181 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.191185 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.191189 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.191193 | orchestrator | 2026-04-20 00:57:41.191282 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] *********************** 2026-04-20 00:57:41.191291 | orchestrator | Monday 20 April 2026 00:47:42 +0000 (0:00:01.252) 0:00:38.958 ********** 2026-04-20 00:57:41.191295 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.191299 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.191303 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.191307 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.191310 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.191314 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.191318 | orchestrator | 2026-04-20 00:57:41.191323 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv4] ************************* 2026-04-20 00:57:41.191329 | orchestrator | Monday 20 April 2026 00:47:44 +0000 (0:00:01.754) 0:00:40.713 ********** 2026-04-20 00:57:41.191335 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-0) 2026-04-20 00:57:41.191343 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-1) 2026-04-20 00:57:41.191350 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-2) 2026-04-20 00:57:41.191357 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-0) 2026-04-20 00:57:41.191364 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-0) 2026-04-20 00:57:41.191370 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0) 2026-04-20 00:57:41.191377 | orchestrator | ok: [testbed-node-2] => (item=testbed-node-0) 
2026-04-20 00:57:41.191382 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-1) 2026-04-20 00:57:41.191386 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-1) 2026-04-20 00:57:41.191390 | orchestrator | ok: [testbed-node-1] => (item=testbed-node-0) 2026-04-20 00:57:41.191394 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-1) 2026-04-20 00:57:41.191398 | orchestrator | ok: [testbed-node-2] => (item=testbed-node-1) 2026-04-20 00:57:41.191401 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-2) 2026-04-20 00:57:41.191405 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-2) 2026-04-20 00:57:41.191409 | orchestrator | ok: [testbed-node-2] => (item=testbed-node-2) 2026-04-20 00:57:41.191413 | orchestrator | ok: [testbed-node-1] => (item=testbed-node-1) 2026-04-20 00:57:41.191422 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-2) 2026-04-20 00:57:41.191426 | orchestrator | ok: [testbed-node-1] => (item=testbed-node-2) 2026-04-20 00:57:41.191430 | orchestrator | 2026-04-20 00:57:41.191434 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv6] ************************* 2026-04-20 00:57:41.191438 | orchestrator | Monday 20 April 2026 00:47:47 +0000 (0:00:03.179) 0:00:43.892 ********** 2026-04-20 00:57:41.191442 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)  2026-04-20 00:57:41.191446 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)  2026-04-20 00:57:41.191450 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-0)  2026-04-20 00:57:41.191455 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)  2026-04-20 00:57:41.191460 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-1)  2026-04-20 00:57:41.191466 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-2)  2026-04-20 00:57:41.191472 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.191484 | orchestrator | skipping: [testbed-node-5] 
=> (item=testbed-node-0)  2026-04-20 00:57:41.191489 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-1)  2026-04-20 00:57:41.191494 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-2)  2026-04-20 00:57:41.191500 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.191505 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2026-04-20 00:57:41.191511 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2026-04-20 00:57:41.191516 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2026-04-20 00:57:41.191521 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.191528 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-0)  2026-04-20 00:57:41.191537 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-1)  2026-04-20 00:57:41.191544 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-2)  2026-04-20 00:57:41.191550 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.191556 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.191561 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-0)  2026-04-20 00:57:41.191569 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-1)  2026-04-20 00:57:41.191575 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-2)  2026-04-20 00:57:41.191581 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.191586 | orchestrator | 2026-04-20 00:57:41.191592 | orchestrator | TASK [ceph-facts : Import_tasks set_radosgw_address.yml] *********************** 2026-04-20 00:57:41.191599 | orchestrator | Monday 20 April 2026 00:47:49 +0000 (0:00:01.723) 0:00:45.616 ********** 2026-04-20 00:57:41.191604 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.191610 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.191616 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.191624 | orchestrator | 
included: /ansible/roles/ceph-facts/tasks/set_radosgw_address.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:57:41.191630 | orchestrator | 2026-04-20 00:57:41.191636 | orchestrator | TASK [ceph-facts : Set current radosgw_address_block, radosgw_address, radosgw_interface from node "{{ ceph_dashboard_call_item }}"] *** 2026-04-20 00:57:41.191659 | orchestrator | Monday 20 April 2026 00:47:50 +0000 (0:00:01.671) 0:00:47.287 ********** 2026-04-20 00:57:41.191666 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.191671 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.191677 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.191685 | orchestrator | 2026-04-20 00:57:41.191689 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv4] **** 2026-04-20 00:57:41.191693 | orchestrator | Monday 20 April 2026 00:47:51 +0000 (0:00:00.697) 0:00:47.985 ********** 2026-04-20 00:57:41.191697 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.191701 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.191705 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.191714 | orchestrator | 2026-04-20 00:57:41.191744 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv6] **** 2026-04-20 00:57:41.191748 | orchestrator | Monday 20 April 2026 00:47:52 +0000 (0:00:00.484) 0:00:48.469 ********** 2026-04-20 00:57:41.191752 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.191756 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.191760 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.191763 | orchestrator | 2026-04-20 00:57:41.191767 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address] *************** 2026-04-20 00:57:41.191771 | orchestrator | Monday 20 April 2026 00:47:52 +0000 (0:00:00.496) 0:00:48.966 ********** 2026-04-20 00:57:41.191775 | orchestrator | 
ok: [testbed-node-3] 2026-04-20 00:57:41.191779 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.191783 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.191787 | orchestrator | 2026-04-20 00:57:41.191790 | orchestrator | TASK [ceph-facts : Set_fact _interface] **************************************** 2026-04-20 00:57:41.191794 | orchestrator | Monday 20 April 2026 00:47:53 +0000 (0:00:00.580) 0:00:49.546 ********** 2026-04-20 00:57:41.191798 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-20 00:57:41.191802 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-20 00:57:41.191805 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-20 00:57:41.191809 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.191813 | orchestrator | 2026-04-20 00:57:41.191817 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv4] ****** 2026-04-20 00:57:41.191820 | orchestrator | Monday 20 April 2026 00:47:53 +0000 (0:00:00.370) 0:00:49.917 ********** 2026-04-20 00:57:41.191824 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-20 00:57:41.191828 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-20 00:57:41.191832 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-20 00:57:41.191835 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.191839 | orchestrator | 2026-04-20 00:57:41.191843 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv6] ****** 2026-04-20 00:57:41.191847 | orchestrator | Monday 20 April 2026 00:47:54 +0000 (0:00:00.549) 0:00:50.466 ********** 2026-04-20 00:57:41.191850 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-20 00:57:41.191855 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-20 00:57:41.191858 | orchestrator | skipping: [testbed-node-3] 
=> (item=testbed-node-5)  2026-04-20 00:57:41.191862 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.191866 | orchestrator | 2026-04-20 00:57:41.191870 | orchestrator | TASK [ceph-facts : Reset rgw_instances (workaround)] *************************** 2026-04-20 00:57:41.191874 | orchestrator | Monday 20 April 2026 00:47:54 +0000 (0:00:00.510) 0:00:50.976 ********** 2026-04-20 00:57:41.191878 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.191881 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.191885 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.191889 | orchestrator | 2026-04-20 00:57:41.191893 | orchestrator | TASK [ceph-facts : Set_fact rgw_instances] ************************************* 2026-04-20 00:57:41.191896 | orchestrator | Monday 20 April 2026 00:47:55 +0000 (0:00:00.783) 0:00:51.760 ********** 2026-04-20 00:57:41.191904 | orchestrator | ok: [testbed-node-3] => (item=0) 2026-04-20 00:57:41.191908 | orchestrator | ok: [testbed-node-4] => (item=0) 2026-04-20 00:57:41.191912 | orchestrator | ok: [testbed-node-5] => (item=0) 2026-04-20 00:57:41.191916 | orchestrator | 2026-04-20 00:57:41.191921 | orchestrator | TASK [ceph-facts : Set_fact ceph_run_cmd] ************************************** 2026-04-20 00:57:41.191925 | orchestrator | Monday 20 April 2026 00:47:56 +0000 (0:00:01.019) 0:00:52.779 ********** 2026-04-20 00:57:41.191930 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2026-04-20 00:57:41.191935 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2026-04-20 00:57:41.191939 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2026-04-20 00:57:41.191947 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-3) 2026-04-20 00:57:41.191952 | orchestrator | ok: [testbed-node-3 -> testbed-node-4(192.168.16.14)] => (item=testbed-node-4) 2026-04-20 00:57:41.191956 | 
orchestrator | ok: [testbed-node-3 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5) 2026-04-20 00:57:41.191960 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] => (item=testbed-manager) 2026-04-20 00:57:41.191964 | orchestrator | 2026-04-20 00:57:41.191968 | orchestrator | TASK [ceph-facts : Set_fact ceph_admin_command] ******************************** 2026-04-20 00:57:41.191973 | orchestrator | Monday 20 April 2026 00:47:57 +0000 (0:00:01.002) 0:00:53.782 ********** 2026-04-20 00:57:41.191977 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2026-04-20 00:57:41.191981 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2026-04-20 00:57:41.191985 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2026-04-20 00:57:41.191990 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-3) 2026-04-20 00:57:41.191994 | orchestrator | ok: [testbed-node-3 -> testbed-node-4(192.168.16.14)] => (item=testbed-node-4) 2026-04-20 00:57:41.191998 | orchestrator | ok: [testbed-node-3 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5) 2026-04-20 00:57:41.192002 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] => (item=testbed-manager) 2026-04-20 00:57:41.192006 | orchestrator | 2026-04-20 00:57:41.192010 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2026-04-20 00:57:41.192014 | orchestrator | Monday 20 April 2026 00:48:00 +0000 (0:00:03.395) 0:00:57.177 ********** 2026-04-20 00:57:41.192032 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:57:41.192039 | orchestrator | 2026-04-20 00:57:41.192045 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] 
********************* 2026-04-20 00:57:41.192051 | orchestrator | Monday 20 April 2026 00:48:02 +0000 (0:00:01.509) 0:00:58.687 ********** 2026-04-20 00:57:41.192057 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-1, testbed-node-0, testbed-node-2 2026-04-20 00:57:41.192063 | orchestrator | 2026-04-20 00:57:41.192070 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2026-04-20 00:57:41.192077 | orchestrator | Monday 20 April 2026 00:48:03 +0000 (0:00:01.562) 0:01:00.250 ********** 2026-04-20 00:57:41.192084 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.192090 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.192097 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.192104 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.192111 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.192117 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.192124 | orchestrator | 2026-04-20 00:57:41.192130 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2026-04-20 00:57:41.192137 | orchestrator | Monday 20 April 2026 00:48:05 +0000 (0:00:01.736) 0:01:01.986 ********** 2026-04-20 00:57:41.192144 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.192151 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.192157 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.192163 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.192170 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.192176 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.192182 | orchestrator | 2026-04-20 00:57:41.192189 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2026-04-20 00:57:41.192195 | orchestrator | Monday 20 April 2026 00:48:06 +0000 
(0:00:01.232) 0:01:03.218 ********** 2026-04-20 00:57:41.192202 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.192294 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.192314 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.192320 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.192327 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.192334 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.192340 | orchestrator | 2026-04-20 00:57:41.192347 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2026-04-20 00:57:41.192353 | orchestrator | Monday 20 April 2026 00:48:07 +0000 (0:00:00.794) 0:01:04.012 ********** 2026-04-20 00:57:41.192360 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.192366 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.192373 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.192379 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.192385 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.192391 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.192397 | orchestrator | 2026-04-20 00:57:41.192403 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2026-04-20 00:57:41.192410 | orchestrator | Monday 20 April 2026 00:48:08 +0000 (0:00:01.107) 0:01:05.120 ********** 2026-04-20 00:57:41.192416 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.192423 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.192435 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.192441 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.192447 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.192453 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.192459 | orchestrator | 2026-04-20 00:57:41.192466 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] 
************************* 2026-04-20 00:57:41.192473 | orchestrator | Monday 20 April 2026 00:48:09 +0000 (0:00:00.969) 0:01:06.089 ********** 2026-04-20 00:57:41.192479 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.192485 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.192490 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.192496 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.192502 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.192508 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.192514 | orchestrator | 2026-04-20 00:57:41.192520 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2026-04-20 00:57:41.192526 | orchestrator | Monday 20 April 2026 00:48:10 +0000 (0:00:01.019) 0:01:07.109 ********** 2026-04-20 00:57:41.192532 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.192538 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.192544 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.192550 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.192557 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.192562 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.192569 | orchestrator | 2026-04-20 00:57:41.192576 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2026-04-20 00:57:41.192582 | orchestrator | Monday 20 April 2026 00:48:11 +0000 (0:00:00.754) 0:01:07.863 ********** 2026-04-20 00:57:41.192588 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.192594 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.192600 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.192606 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.192612 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.192618 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.192624 | orchestrator 
| 2026-04-20 00:57:41.192630 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2026-04-20 00:57:41.192636 | orchestrator | Monday 20 April 2026 00:48:12 +0000 (0:00:01.241) 0:01:09.105 ********** 2026-04-20 00:57:41.192643 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.192649 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.192656 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.192662 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.192668 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.192675 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.192688 | orchestrator | 2026-04-20 00:57:41.192694 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2026-04-20 00:57:41.192701 | orchestrator | Monday 20 April 2026 00:48:13 +0000 (0:00:01.110) 0:01:10.216 ********** 2026-04-20 00:57:41.192705 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.192709 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.192713 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.192717 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.192720 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.192763 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.192767 | orchestrator | 2026-04-20 00:57:41.192771 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2026-04-20 00:57:41.192775 | orchestrator | Monday 20 April 2026 00:48:14 +0000 (0:00:00.663) 0:01:10.880 ********** 2026-04-20 00:57:41.192779 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.192783 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.192786 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.192790 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.192794 | orchestrator | ok: [testbed-node-1] 2026-04-20 
00:57:41.192798 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.192801 | orchestrator | 2026-04-20 00:57:41.192805 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2026-04-20 00:57:41.192809 | orchestrator | Monday 20 April 2026 00:48:15 +0000 (0:00:00.659) 0:01:11.539 ********** 2026-04-20 00:57:41.192813 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.192817 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.192821 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.192825 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.192829 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.192833 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.192837 | orchestrator | 2026-04-20 00:57:41.192840 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2026-04-20 00:57:41.192844 | orchestrator | Monday 20 April 2026 00:48:16 +0000 (0:00:01.030) 0:01:12.570 ********** 2026-04-20 00:57:41.192848 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.192852 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.192855 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.192859 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.192863 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.192867 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.192871 | orchestrator | 2026-04-20 00:57:41.192875 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2026-04-20 00:57:41.192878 | orchestrator | Monday 20 April 2026 00:48:16 +0000 (0:00:00.671) 0:01:13.242 ********** 2026-04-20 00:57:41.192882 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.192886 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.192890 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.192894 | orchestrator | skipping: [testbed-node-0] 
2026-04-20 00:57:41.192897 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.192901 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.192905 | orchestrator | 2026-04-20 00:57:41.192909 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2026-04-20 00:57:41.192912 | orchestrator | Monday 20 April 2026 00:48:17 +0000 (0:00:00.645) 0:01:13.888 ********** 2026-04-20 00:57:41.192916 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.192920 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.192924 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.192929 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.192935 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.192941 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.192947 | orchestrator | 2026-04-20 00:57:41.192954 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2026-04-20 00:57:41.192960 | orchestrator | Monday 20 April 2026 00:48:18 +0000 (0:00:00.501) 0:01:14.390 ********** 2026-04-20 00:57:41.192973 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.192985 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.192992 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.192998 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.193004 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.193010 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.193016 | orchestrator | 2026-04-20 00:57:41.193022 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2026-04-20 00:57:41.193029 | orchestrator | Monday 20 April 2026 00:48:18 +0000 (0:00:00.670) 0:01:15.060 ********** 2026-04-20 00:57:41.193035 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.193041 | orchestrator | skipping: [testbed-node-4] 
2026-04-20 00:57:41.193045 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.193048 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.193052 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.193058 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.193065 | orchestrator | 2026-04-20 00:57:41.193071 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2026-04-20 00:57:41.193077 | orchestrator | Monday 20 April 2026 00:48:19 +0000 (0:00:00.597) 0:01:15.657 ********** 2026-04-20 00:57:41.193083 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.193089 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.193095 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.193102 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.193108 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.193114 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.193119 | orchestrator | 2026-04-20 00:57:41.193125 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2026-04-20 00:57:41.193130 | orchestrator | Monday 20 April 2026 00:48:19 +0000 (0:00:00.683) 0:01:16.341 ********** 2026-04-20 00:57:41.193136 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.193142 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.193148 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.193153 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.193159 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.193165 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.193171 | orchestrator | 2026-04-20 00:57:41.193177 | orchestrator | TASK [ceph-container-common : Generate systemd ceph target file] *************** 2026-04-20 00:57:41.193184 | orchestrator | Monday 20 April 2026 00:48:20 +0000 (0:00:00.929) 0:01:17.270 ********** 2026-04-20 00:57:41.193190 | orchestrator | changed: [testbed-node-5] 2026-04-20 
00:57:41.193196 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.193201 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.193228 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:57:41.193235 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.193243 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.193248 | orchestrator | 2026-04-20 00:57:41.193254 | orchestrator | TASK [ceph-container-common : Enable ceph.target] ****************************** 2026-04-20 00:57:41.193261 | orchestrator | Monday 20 April 2026 00:48:22 +0000 (0:00:02.000) 0:01:19.271 ********** 2026-04-20 00:57:41.193267 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.193273 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:57:41.193280 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.193286 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.193332 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.193340 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.193346 | orchestrator | 2026-04-20 00:57:41.193353 | orchestrator | TASK [ceph-container-common : Include prerequisites.yml] *********************** 2026-04-20 00:57:41.193360 | orchestrator | Monday 20 April 2026 00:48:25 +0000 (0:00:02.166) 0:01:21.437 ********** 2026-04-20 00:57:41.193367 | orchestrator | included: /ansible/roles/ceph-container-common/tasks/prerequisites.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:57:41.193374 | orchestrator | 2026-04-20 00:57:41.193380 | orchestrator | TASK [ceph-container-common : Stop lvmetad] ************************************ 2026-04-20 00:57:41.193398 | orchestrator | Monday 20 April 2026 00:48:26 +0000 (0:00:01.128) 0:01:22.565 ********** 2026-04-20 00:57:41.193404 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.193410 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.193416 
| orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.193422 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.193428 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.193434 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.193440 | orchestrator | 2026-04-20 00:57:41.193446 | orchestrator | TASK [ceph-container-common : Disable and mask lvmetad service] **************** 2026-04-20 00:57:41.193453 | orchestrator | Monday 20 April 2026 00:48:26 +0000 (0:00:00.764) 0:01:23.330 ********** 2026-04-20 00:57:41.193459 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.193466 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.193472 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.193477 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.193483 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.193489 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.193495 | orchestrator | 2026-04-20 00:57:41.193500 | orchestrator | TASK [ceph-container-common : Remove ceph udev rules] ************************** 2026-04-20 00:57:41.193506 | orchestrator | Monday 20 April 2026 00:48:27 +0000 (0:00:00.951) 0:01:24.281 ********** 2026-04-20 00:57:41.193513 | orchestrator | ok: [testbed-node-3] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2026-04-20 00:57:41.193519 | orchestrator | ok: [testbed-node-0] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2026-04-20 00:57:41.193525 | orchestrator | ok: [testbed-node-4] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2026-04-20 00:57:41.193531 | orchestrator | ok: [testbed-node-5] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2026-04-20 00:57:41.193538 | orchestrator | ok: [testbed-node-3] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2026-04-20 00:57:41.193546 | orchestrator | ok: [testbed-node-1] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2026-04-20 00:57:41.193552 | orchestrator 
| ok: [testbed-node-2] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2026-04-20 00:57:41.193559 | orchestrator | ok: [testbed-node-0] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2026-04-20 00:57:41.193565 | orchestrator | ok: [testbed-node-5] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2026-04-20 00:57:41.193579 | orchestrator | ok: [testbed-node-4] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2026-04-20 00:57:41.193586 | orchestrator | ok: [testbed-node-1] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2026-04-20 00:57:41.193592 | orchestrator | ok: [testbed-node-2] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2026-04-20 00:57:41.193597 | orchestrator | 2026-04-20 00:57:41.193603 | orchestrator | TASK [ceph-container-common : Ensure tmpfiles.d is present] ******************** 2026-04-20 00:57:41.193610 | orchestrator | Monday 20 April 2026 00:48:29 +0000 (0:00:01.773) 0:01:26.055 ********** 2026-04-20 00:57:41.193616 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.193622 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.193628 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.193634 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:57:41.193640 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.193647 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.193653 | orchestrator | 2026-04-20 00:57:41.193660 | orchestrator | TASK [ceph-container-common : Restore certificates selinux context] ************ 2026-04-20 00:57:41.193666 | orchestrator | Monday 20 April 2026 00:48:30 +0000 (0:00:00.992) 0:01:27.048 ********** 2026-04-20 00:57:41.193673 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.193679 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.193687 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.193694 | orchestrator | skipping: [testbed-node-0] 2026-04-20 
00:57:41.193700 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.193715 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.193722 | orchestrator | 2026-04-20 00:57:41.193729 | orchestrator | TASK [ceph-container-common : Install python3 on osd nodes] ******************** 2026-04-20 00:57:41.193736 | orchestrator | Monday 20 April 2026 00:48:31 +0000 (0:00:01.114) 0:01:28.162 ********** 2026-04-20 00:57:41.193742 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.193749 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.193756 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.193763 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.193769 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.193776 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.193782 | orchestrator | 2026-04-20 00:57:41.193788 | orchestrator | TASK [ceph-container-common : Include registry.yml] **************************** 2026-04-20 00:57:41.193795 | orchestrator | Monday 20 April 2026 00:48:32 +0000 (0:00:00.531) 0:01:28.694 ********** 2026-04-20 00:57:41.193801 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.193809 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.193815 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.193821 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.193828 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.193835 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.193841 | orchestrator | 2026-04-20 00:57:41.193848 | orchestrator | TASK [ceph-container-common : Include fetch_image.yml] ************************* 2026-04-20 00:57:41.193919 | orchestrator | Monday 20 April 2026 00:48:33 +0000 (0:00:00.730) 0:01:29.424 ********** 2026-04-20 00:57:41.193930 | orchestrator | included: /ansible/roles/ceph-container-common/tasks/fetch_image.yml for testbed-node-3, testbed-node-4, 
testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:57:41.193938 | orchestrator | 2026-04-20 00:57:41.193944 | orchestrator | TASK [ceph-container-common : Pulling Ceph container image] ******************** 2026-04-20 00:57:41.193951 | orchestrator | Monday 20 April 2026 00:48:34 +0000 (0:00:01.139) 0:01:30.563 ********** 2026-04-20 00:57:41.193957 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.193964 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.193970 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.193977 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.193983 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.193990 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.193997 | orchestrator | 2026-04-20 00:57:41.194003 | orchestrator | TASK [ceph-container-common : Pulling alertmanager/prometheus/grafana container images] *** 2026-04-20 00:57:41.194010 | orchestrator | Monday 20 April 2026 00:49:19 +0000 (0:00:44.964) 0:02:15.527 ********** 2026-04-20 00:57:41.194055 | orchestrator | skipping: [testbed-node-3] => (item=docker.io/prom/alertmanager:v0.16.2)  2026-04-20 00:57:41.194062 | orchestrator | skipping: [testbed-node-3] => (item=docker.io/prom/prometheus:v2.7.2)  2026-04-20 00:57:41.194069 | orchestrator | skipping: [testbed-node-3] => (item=docker.io/grafana/grafana:6.7.4)  2026-04-20 00:57:41.194076 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.194082 | orchestrator | skipping: [testbed-node-4] => (item=docker.io/prom/alertmanager:v0.16.2)  2026-04-20 00:57:41.194089 | orchestrator | skipping: [testbed-node-4] => (item=docker.io/prom/prometheus:v2.7.2)  2026-04-20 00:57:41.194096 | orchestrator | skipping: [testbed-node-4] => (item=docker.io/grafana/grafana:6.7.4)  2026-04-20 00:57:41.194103 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.194110 | orchestrator | skipping: [testbed-node-5] => (item=docker.io/prom/alertmanager:v0.16.2)  
2026-04-20 00:57:41.194117 | orchestrator | skipping: [testbed-node-5] => (item=docker.io/prom/prometheus:v2.7.2)
2026-04-20 00:57:41.194123 | orchestrator | skipping: [testbed-node-5] => (item=docker.io/grafana/grafana:6.7.4)
2026-04-20 00:57:41.194130 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.194138 | orchestrator | skipping: [testbed-node-0] => (item=docker.io/prom/alertmanager:v0.16.2)
2026-04-20 00:57:41.194145 | orchestrator | skipping: [testbed-node-0] => (item=docker.io/prom/prometheus:v2.7.2)
2026-04-20 00:57:41.194158 | orchestrator | skipping: [testbed-node-0] => (item=docker.io/grafana/grafana:6.7.4)
2026-04-20 00:57:41.194165 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.194172 | orchestrator | skipping: [testbed-node-1] => (item=docker.io/prom/alertmanager:v0.16.2)
2026-04-20 00:57:41.194179 | orchestrator | skipping: [testbed-node-1] => (item=docker.io/prom/prometheus:v2.7.2)
2026-04-20 00:57:41.194186 | orchestrator | skipping: [testbed-node-1] => (item=docker.io/grafana/grafana:6.7.4)
2026-04-20 00:57:41.194200 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.194256 | orchestrator | skipping: [testbed-node-2] => (item=docker.io/prom/alertmanager:v0.16.2)
2026-04-20 00:57:41.194266 | orchestrator | skipping: [testbed-node-2] => (item=docker.io/prom/prometheus:v2.7.2)
2026-04-20 00:57:41.194272 | orchestrator | skipping: [testbed-node-2] => (item=docker.io/grafana/grafana:6.7.4)
2026-04-20 00:57:41.194278 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.194284 | orchestrator |
2026-04-20 00:57:41.194291 | orchestrator | TASK [ceph-container-common : Pulling node-exporter container image] ***********
2026-04-20 00:57:41.194297 | orchestrator | Monday 20 April 2026 00:49:20 +0000 (0:00:00.953) 0:02:16.481 **********
2026-04-20 00:57:41.194304 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.194310 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.194316 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.194322 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.194329 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.194335 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.194342 | orchestrator |
2026-04-20 00:57:41.194348 | orchestrator | TASK [ceph-container-common : Export local ceph dev image] *********************
2026-04-20 00:57:41.194355 | orchestrator | Monday 20 April 2026 00:49:20 +0000 (0:00:00.590) 0:02:17.072 **********
2026-04-20 00:57:41.194361 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.194367 | orchestrator |
2026-04-20 00:57:41.194374 | orchestrator | TASK [ceph-container-common : Copy ceph dev image file] ************************
2026-04-20 00:57:41.194380 | orchestrator | Monday 20 April 2026 00:49:20 +0000 (0:00:00.142) 0:02:17.214 **********
2026-04-20 00:57:41.194386 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.194392 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.194398 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.194405 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.194411 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.194417 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.194423 | orchestrator |
2026-04-20 00:57:41.194429 | orchestrator | TASK [ceph-container-common : Load ceph dev image] *****************************
2026-04-20 00:57:41.194436 | orchestrator | Monday 20 April 2026 00:49:21 +0000 (0:00:00.921) 0:02:18.136 **********
2026-04-20 00:57:41.194442 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.194449 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.194455 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.194462 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.194468 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.194474 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.194481 | orchestrator |
2026-04-20 00:57:41.194487 | orchestrator | TASK [ceph-container-common : Remove tmp ceph dev image file] ******************
2026-04-20 00:57:41.194494 | orchestrator | Monday 20 April 2026 00:49:22 +0000 (0:00:00.638) 0:02:18.774 **********
2026-04-20 00:57:41.194500 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.194506 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.194512 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.194553 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.194562 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.194568 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.194575 | orchestrator |
2026-04-20 00:57:41.194582 | orchestrator | TASK [ceph-container-common : Get ceph version] ********************************
2026-04-20 00:57:41.194595 | orchestrator | Monday 20 April 2026 00:49:23 +0000 (0:00:00.934) 0:02:19.709 **********
2026-04-20 00:57:41.194601 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.194607 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.194614 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.194620 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.194627 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.194633 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.194639 | orchestrator |
2026-04-20 00:57:41.194646 | orchestrator | TASK [ceph-container-common : Set_fact ceph_version ceph_version.stdout.split] ***
2026-04-20 00:57:41.194653 | orchestrator | Monday 20 April 2026 00:49:26 +0000 (0:00:03.207) 0:02:22.917 **********
2026-04-20 00:57:41.194660 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.194666 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.194672 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.194678 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.194684 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.194690 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.194695 | orchestrator |
2026-04-20 00:57:41.194702 | orchestrator | TASK [ceph-container-common : Include release.yml] *****************************
2026-04-20 00:57:41.194708 | orchestrator | Monday 20 April 2026 00:49:27 +0000 (0:00:00.924) 0:02:23.841 **********
2026-04-20 00:57:41.194716 | orchestrator | included: /ansible/roles/ceph-container-common/tasks/release.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:57:41.194724 | orchestrator |
2026-04-20 00:57:41.194730 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release jewel] *********************
2026-04-20 00:57:41.194736 | orchestrator | Monday 20 April 2026 00:49:28 +0000 (0:00:01.282) 0:02:25.123 **********
2026-04-20 00:57:41.194742 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.194747 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.194753 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.194759 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.194764 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.194770 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.194775 | orchestrator |
2026-04-20 00:57:41.194781 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release kraken] ********************
2026-04-20 00:57:41.194788 | orchestrator | Monday 20 April 2026 00:49:29 +0000 (0:00:00.633) 0:02:25.757 **********
2026-04-20 00:57:41.194794 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.194800 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.194806 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.194812 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.194818 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.194824 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.194829 | orchestrator |
2026-04-20 00:57:41.194836 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release luminous] ******************
2026-04-20 00:57:41.194842 | orchestrator | Monday 20 April 2026 00:49:30 +0000 (0:00:00.952) 0:02:26.709 **********
2026-04-20 00:57:41.194855 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.194860 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.194866 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.194871 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.194876 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.194881 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.194887 | orchestrator |
2026-04-20 00:57:41.194892 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release mimic] *********************
2026-04-20 00:57:41.194898 | orchestrator | Monday 20 April 2026 00:49:30 +0000 (0:00:00.628) 0:02:27.337 **********
2026-04-20 00:57:41.194904 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.194910 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.194915 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.194920 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.194927 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.194939 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.194944 | orchestrator |
2026-04-20 00:57:41.194947 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release nautilus] ******************
2026-04-20 00:57:41.194951 | orchestrator | Monday 20 April 2026 00:49:31 +0000 (0:00:00.860) 0:02:28.198 **********
2026-04-20 00:57:41.194955 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.194958 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.194962 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.194966 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.194970 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.194973 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.194977 | orchestrator |
2026-04-20 00:57:41.194981 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release octopus] *******************
2026-04-20 00:57:41.194985 | orchestrator | Monday 20 April 2026 00:49:32 +0000 (0:00:00.645) 0:02:28.844 **********
2026-04-20 00:57:41.194989 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.194992 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.194996 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.195000 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.195003 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.195007 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.195011 | orchestrator |
2026-04-20 00:57:41.195014 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release pacific] *******************
2026-04-20 00:57:41.195018 | orchestrator | Monday 20 April 2026 00:49:33 +0000 (0:00:00.896) 0:02:29.740 **********
2026-04-20 00:57:41.195022 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.195025 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.195029 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.195033 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.195037 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.195040 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.195044 | orchestrator |
2026-04-20 00:57:41.195048 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release quincy] ********************
2026-04-20 00:57:41.195052 | orchestrator | Monday 20 April 2026 00:49:33 +0000 (0:00:00.629) 0:02:30.370 **********
2026-04-20 00:57:41.195055 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.195090 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.195097 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.195104 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.195114 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.195121 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.195126 | orchestrator |
2026-04-20 00:57:41.195132 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release reef] **********************
2026-04-20 00:57:41.195138 | orchestrator | Monday 20 April 2026 00:49:35 +0000 (0:00:01.207) 0:02:31.577 **********
2026-04-20 00:57:41.195144 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.195149 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.195155 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.195161 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.195167 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.195172 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.195178 | orchestrator |
2026-04-20 00:57:41.195184 | orchestrator | TASK [ceph-config : Include create_ceph_initial_dirs.yml] **********************
2026-04-20 00:57:41.195190 | orchestrator | Monday 20 April 2026 00:49:36 +0000 (0:00:01.232) 0:02:32.810 **********
2026-04-20 00:57:41.195197 | orchestrator | included: /ansible/roles/ceph-config/tasks/create_ceph_initial_dirs.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:57:41.195204 | orchestrator |
2026-04-20 00:57:41.195240 | orchestrator | TASK [ceph-config : Create ceph initial directories] ***************************
2026-04-20 00:57:41.195246 | orchestrator | Monday 20 April 2026 00:49:37 +0000 (0:00:01.293) 0:02:34.103 **********
2026-04-20 00:57:41.195252 | orchestrator | changed: [testbed-node-3] => (item=/etc/ceph)
2026-04-20 00:57:41.195267 | orchestrator | changed: [testbed-node-4] => (item=/etc/ceph)
2026-04-20 00:57:41.195273 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/)
2026-04-20 00:57:41.195280 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/)
2026-04-20 00:57:41.195286 | orchestrator | changed: [testbed-node-5] => (item=/etc/ceph)
2026-04-20 00:57:41.195292 | orchestrator | changed: [testbed-node-0] => (item=/etc/ceph)
2026-04-20 00:57:41.195299 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/mon)
2026-04-20 00:57:41.195303 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/mon)
2026-04-20 00:57:41.195307 | orchestrator | changed: [testbed-node-2] => (item=/etc/ceph)
2026-04-20 00:57:41.195311 | orchestrator | changed: [testbed-node-1] => (item=/etc/ceph)
2026-04-20 00:57:41.195315 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/)
2026-04-20 00:57:41.195318 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/osd)
2026-04-20 00:57:41.195322 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/osd)
2026-04-20 00:57:41.195326 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/)
2026-04-20 00:57:41.195330 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/)
2026-04-20 00:57:41.195334 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/)
2026-04-20 00:57:41.195338 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/mon)
2026-04-20 00:57:41.195347 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/mds)
2026-04-20 00:57:41.195352 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/mds)
2026-04-20 00:57:41.195355 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/mon)
2026-04-20 00:57:41.195359 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/mon)
2026-04-20 00:57:41.195363 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/mon)
2026-04-20 00:57:41.195368 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/osd)
2026-04-20 00:57:41.195374 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/tmp)
2026-04-20 00:57:41.195380 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/osd)
2026-04-20 00:57:41.195385 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/osd)
2026-04-20 00:57:41.195393 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/tmp)
2026-04-20 00:57:41.195402 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/osd)
2026-04-20 00:57:41.195407 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/mds)
2026-04-20 00:57:41.195413 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/crash)
2026-04-20 00:57:41.195418 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/mds)
2026-04-20 00:57:41.195424 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/mds)
2026-04-20 00:57:41.195429 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/crash)
2026-04-20 00:57:41.195435 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/mds)
2026-04-20 00:57:41.195441 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/tmp)
2026-04-20 00:57:41.195446 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/radosgw)
2026-04-20 00:57:41.195452 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/tmp)
2026-04-20 00:57:41.195458 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/tmp)
2026-04-20 00:57:41.195463 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/radosgw)
2026-04-20 00:57:41.195468 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/tmp)
2026-04-20 00:57:41.195475 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/crash)
2026-04-20 00:57:41.195482 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-rgw)
2026-04-20 00:57:41.195489 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/crash)
2026-04-20 00:57:41.195495 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/crash)
2026-04-20 00:57:41.195501 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/crash)
2026-04-20 00:57:41.195538 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-rgw)
2026-04-20 00:57:41.195545 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/radosgw)
2026-04-20 00:57:41.195587 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/radosgw)
2026-04-20 00:57:41.195594 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-mgr)
2026-04-20 00:57:41.195601 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/radosgw)
2026-04-20 00:57:41.195607 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/radosgw)
2026-04-20 00:57:41.195613 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-mgr)
2026-04-20 00:57:41.195619 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-rgw)
2026-04-20 00:57:41.195625 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-rgw)
2026-04-20 00:57:41.195630 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-mds)
2026-04-20 00:57:41.195637 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-rgw)
2026-04-20 00:57:41.195643 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-rgw)
2026-04-20 00:57:41.195649 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-mds)
2026-04-20 00:57:41.195656 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-mgr)
2026-04-20 00:57:41.195662 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-mgr)
2026-04-20 00:57:41.195668 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-osd)
2026-04-20 00:57:41.195674 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-mgr)
2026-04-20 00:57:41.195681 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-osd)
2026-04-20 00:57:41.195687 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-mgr)
2026-04-20 00:57:41.195693 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-mds)
2026-04-20 00:57:41.195700 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-mds)
2026-04-20 00:57:41.195706 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-rbd)
2026-04-20 00:57:41.195712 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-mds)
2026-04-20 00:57:41.195718 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-mds)
2026-04-20 00:57:41.195724 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-osd)
2026-04-20 00:57:41.195730 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-rbd)
2026-04-20 00:57:41.195736 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-osd)
2026-04-20 00:57:41.195742 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2026-04-20 00:57:41.195749 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-osd)
2026-04-20 00:57:41.195755 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-osd)
2026-04-20 00:57:41.195766 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-rbd)
2026-04-20 00:57:41.195773 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2026-04-20 00:57:41.195779 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-rbd)
2026-04-20 00:57:41.195785 | orchestrator | changed: [testbed-node-3] => (item=/var/run/ceph)
2026-04-20 00:57:41.195791 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-rbd)
2026-04-20 00:57:41.195797 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-rbd)
2026-04-20 00:57:41.195803 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2026-04-20 00:57:41.195809 | orchestrator | changed: [testbed-node-4] => (item=/var/run/ceph)
2026-04-20 00:57:41.195816 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2026-04-20 00:57:41.195822 | orchestrator | changed: [testbed-node-3] => (item=/var/log/ceph)
2026-04-20 00:57:41.195834 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2026-04-20 00:57:41.195840 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2026-04-20 00:57:41.195846 | orchestrator | changed: [testbed-node-0] => (item=/var/run/ceph)
2026-04-20 00:57:41.195853 | orchestrator | changed: [testbed-node-2] => (item=/var/run/ceph)
2026-04-20 00:57:41.195859 | orchestrator | changed: [testbed-node-4] => (item=/var/log/ceph)
2026-04-20 00:57:41.195865 | orchestrator | changed: [testbed-node-5] => (item=/var/run/ceph)
2026-04-20 00:57:41.195871 | orchestrator | changed: [testbed-node-1] => (item=/var/run/ceph)
2026-04-20 00:57:41.195877 | orchestrator | changed: [testbed-node-0] => (item=/var/log/ceph)
2026-04-20 00:57:41.195883 | orchestrator | changed: [testbed-node-2] => (item=/var/log/ceph)
2026-04-20 00:57:41.195889 | orchestrator | changed: [testbed-node-5] => (item=/var/log/ceph)
2026-04-20 00:57:41.195894 | orchestrator | changed: [testbed-node-1] => (item=/var/log/ceph)
2026-04-20 00:57:41.195900 | orchestrator |
2026-04-20 00:57:41.195906 | orchestrator | TASK [ceph-config : Include_tasks rgw_systemd_environment_file.yml] ************
2026-04-20 00:57:41.195912 | orchestrator | Monday 20 April 2026 00:49:44 +0000 (0:00:07.046) 0:02:41.150 **********
2026-04-20 00:57:41.195918 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.195924 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.195930 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.195937 | orchestrator | included: /ansible/roles/ceph-config/tasks/rgw_systemd_environment_file.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:57:41.195945 | orchestrator |
2026-04-20 00:57:41.195952 | orchestrator | TASK [ceph-config : Create rados gateway instance directories] *****************
2026-04-20 00:57:41.195979 | orchestrator | Monday 20 April 2026 00:49:45 +0000 (0:00:00.865) 0:02:42.015 **********
2026-04-20 00:57:41.195985 | orchestrator | changed: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081})
2026-04-20 00:57:41.195991 | orchestrator | changed: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081})
2026-04-20 00:57:41.195997 | orchestrator | changed: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081})
2026-04-20 00:57:41.196003 | orchestrator |
2026-04-20 00:57:41.196009 | orchestrator | TASK [ceph-config : Generate environment file] *********************************
2026-04-20 00:57:41.196016 | orchestrator | Monday 20 April 2026 00:49:46 +0000 (0:00:00.678) 0:02:42.694 **********
2026-04-20 00:57:41.196022 | orchestrator | changed: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081})
2026-04-20 00:57:41.196028 | orchestrator | changed: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081})
2026-04-20 00:57:41.196035 | orchestrator | changed: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081})
2026-04-20 00:57:41.196041 | orchestrator |
2026-04-20 00:57:41.196047 | orchestrator | TASK [ceph-config : Reset num_osds] ********************************************
2026-04-20 00:57:41.196053 | orchestrator | Monday 20 April 2026 00:49:47 +0000 (0:00:01.280) 0:02:43.974 **********
2026-04-20 00:57:41.196060 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.196066 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.196072 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.196078 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.196084 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.196090 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.196096 | orchestrator |
2026-04-20 00:57:41.196103 | orchestrator | TASK [ceph-config : Count number of osds for lvm scenario] *********************
2026-04-20 00:57:41.196109 | orchestrator | Monday 20 April 2026 00:49:48 +0000 (0:00:00.867) 0:02:44.841 **********
2026-04-20 00:57:41.196115 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.196126 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.196133 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.196139 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.196146 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.196152 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.196158 | orchestrator |
2026-04-20 00:57:41.196164 | orchestrator | TASK [ceph-config : Look up for ceph-volume rejected devices] ******************
2026-04-20 00:57:41.196171 | orchestrator | Monday 20 April 2026 00:49:49 +0000 (0:00:00.713) 0:02:45.555 **********
2026-04-20 00:57:41.196177 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.196184 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.196190 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.196196 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.196202 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.196232 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.196238 | orchestrator |
2026-04-20 00:57:41.196245 | orchestrator | TASK [ceph-config : Set_fact rejected_devices] *********************************
2026-04-20 00:57:41.196252 | orchestrator | Monday 20 April 2026 00:49:50 +0000 (0:00:00.823) 0:02:46.378 **********
2026-04-20 00:57:41.196258 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.196264 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.196270 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.196277 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.196283 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.196289 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.196294 | orchestrator |
2026-04-20 00:57:41.196300 | orchestrator | TASK [ceph-config : Set_fact _devices] *****************************************
2026-04-20 00:57:41.196306 | orchestrator | Monday 20 April 2026 00:49:50 +0000 (0:00:00.660) 0:02:47.039 **********
2026-04-20 00:57:41.196312 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.196318 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.196325 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.196331 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.196337 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.196344 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.196349 | orchestrator |
2026-04-20 00:57:41.196357 | orchestrator | TASK [ceph-config : Run 'ceph-volume lvm batch --report' to see how many osds are to be created] ***
2026-04-20 00:57:41.196363 | orchestrator | Monday 20 April 2026 00:49:51 +0000 (0:00:00.911) 0:02:47.950 **********
2026-04-20 00:57:41.196369 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.196376 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.196382 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.196387 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.196393 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.196400 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.196406 | orchestrator |
2026-04-20 00:57:41.196412 | orchestrator | TASK [ceph-config : Set_fact num_osds from the output of 'ceph-volume lvm batch --report' (legacy report)] ***
2026-04-20 00:57:41.196418 | orchestrator | Monday 20 April 2026 00:49:52 +0000 (0:00:00.636) 0:02:48.587 **********
2026-04-20 00:57:41.196424 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.196430 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.196436 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.196442 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.196449 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.196455 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.196461 | orchestrator |
2026-04-20 00:57:41.196468 | orchestrator | TASK [ceph-config : Set_fact num_osds from the output of 'ceph-volume lvm batch --report' (new report)] ***
2026-04-20 00:57:41.196475 | orchestrator | Monday 20 April 2026 00:49:53 +0000 (0:00:00.797) 0:02:49.384 **********
2026-04-20 00:57:41.196481 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.196488 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.196500 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.196532 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.196540 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.196546 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.196552 | orchestrator |
2026-04-20 00:57:41.196559 | orchestrator | TASK [ceph-config : Run 'ceph-volume lvm list' to see how many osds have already been created] ***
2026-04-20 00:57:41.196565 | orchestrator | Monday 20 April 2026 00:49:53 +0000 (0:00:00.741) 0:02:50.126 **********
2026-04-20 00:57:41.196571 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.196578 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.196584 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.196590 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.196596 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.196602 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.196609 | orchestrator |
2026-04-20 00:57:41.196615 | orchestrator | TASK [ceph-config : Set_fact num_osds (add existing osds)] *********************
2026-04-20 00:57:41.196621 | orchestrator | Monday 20 April 2026 00:49:57 +0000 (0:00:03.453) 0:02:53.579 **********
2026-04-20 00:57:41.196627 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.196634 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.196640 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.196646 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.196652 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.196658 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.196664 | orchestrator |
2026-04-20 00:57:41.196670 | orchestrator | TASK [ceph-config : Set_fact _osd_memory_target] *******************************
2026-04-20 00:57:41.196677 | orchestrator | Monday 20 April 2026 00:49:57 +0000 (0:00:00.559) 0:02:54.138 **********
2026-04-20 00:57:41.196683 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.196690 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.196696 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.196717 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.196723 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.196729 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.196736 | orchestrator |
2026-04-20 00:57:41.196742 | orchestrator | TASK [ceph-config : Set osd_memory_target to cluster host config] **************
2026-04-20 00:57:41.196748 | orchestrator | Monday 20 April 2026 00:49:58 +0000 (0:00:00.607) 0:02:54.746 **********
2026-04-20 00:57:41.196755 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.196761 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.196767 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.196773 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.196779 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.196785 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.196791 | orchestrator |
2026-04-20 00:57:41.196797 | orchestrator | TASK [ceph-config : Render rgw configs] ****************************************
2026-04-20 00:57:41.196804 | orchestrator | Monday 20 April 2026 00:49:59 +0000 (0:00:00.672) 0:02:55.419 **********
2026-04-20 00:57:41.196810 | orchestrator | ok: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081})
2026-04-20 00:57:41.196817 | orchestrator | ok: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081})
2026-04-20 00:57:41.196828 | orchestrator | ok: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081})
2026-04-20 00:57:41.196834 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.196841 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.196847 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.196853 | orchestrator |
2026-04-20 00:57:41.196859 | orchestrator | TASK [ceph-config : Set config to cluster] *************************************
2026-04-20 00:57:41.196866 | orchestrator | Monday 20 April 2026 00:49:59 +0000 (0:00:00.504) 0:02:55.923 **********
2026-04-20 00:57:41.196875 | orchestrator | skipping: [testbed-node-3] => (item=[{'key': 'client.rgw.default.testbed-node-3.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-3.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.13:8081'}}, {'key': 'log_file', 'value': '/var/log/ceph/ceph-rgw-default-testbed-node-3.rgw0.log'}])
2026-04-20 00:57:41.196890 | orchestrator | skipping: [testbed-node-3] => (item=[{'key': 'client.rgw.default.testbed-node-3.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-3.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.13:8081'}}, {'key': 'rgw_frontends', 'value': 'beast endpoint=192.168.16.13:8081'}])
2026-04-20 00:57:41.196899 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.196904 | orchestrator | skipping: [testbed-node-4] => (item=[{'key': 'client.rgw.default.testbed-node-4.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-4.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.14:8081'}}, {'key': 'log_file', 'value': '/var/log/ceph/ceph-rgw-default-testbed-node-4.rgw0.log'}])
2026-04-20 00:57:41.196910 | orchestrator | skipping: [testbed-node-4] => (item=[{'key': 'client.rgw.default.testbed-node-4.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-4.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.14:8081'}}, {'key': 'rgw_frontends', 'value': 'beast endpoint=192.168.16.14:8081'}])
2026-04-20 00:57:41.196916 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.196942 | orchestrator | skipping: [testbed-node-5] => (item=[{'key': 'client.rgw.default.testbed-node-5.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-5.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.15:8081'}}, {'key': 'log_file', 'value': '/var/log/ceph/ceph-rgw-default-testbed-node-5.rgw0.log'}])
2026-04-20 00:57:41.196950 | orchestrator | skipping: [testbed-node-5] => (item=[{'key': 'client.rgw.default.testbed-node-5.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-5.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.15:8081'}}, {'key': 'rgw_frontends', 'value': 'beast endpoint=192.168.16.15:8081'}])
 2026-04-20 00:57:41.196956 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.196962 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.196967 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.196973 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.196978 | orchestrator | 2026-04-20 00:57:41.196983 | orchestrator | TASK [ceph-config : Set rgw configs to file] *********************************** 2026-04-20 00:57:41.196989 | orchestrator | Monday 20 April 2026 00:50:00 +0000 (0:00:00.903) 0:02:56.826 ********** 2026-04-20 00:57:41.196994 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.196999 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.197004 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.197010 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.197016 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.197023 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.197028 | orchestrator | 2026-04-20 00:57:41.197034 | orchestrator | TASK [ceph-config : Create ceph conf directory] ******************************** 2026-04-20 00:57:41.197040 | orchestrator | Monday 20 April 2026 00:50:01 +0000 (0:00:00.563) 0:02:57.389 ********** 2026-04-20 00:57:41.197046 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.197051 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.197057 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.197062 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.197068 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.197075 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.197080 | orchestrator | 2026-04-20 00:57:41.197086 | orchestrator | TASK [ceph-facts : Set current radosgw_address_block, radosgw_address, radosgw_interface from node "{{ ceph_dashboard_call_item }}"] *** 2026-04-20 00:57:41.197093 | orchestrator | Monday 20 April 2026 
00:50:01 +0000 (0:00:00.879) 0:02:58.269 ********** 2026-04-20 00:57:41.197099 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.197105 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.197117 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.197123 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.197129 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.197134 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.197140 | orchestrator | 2026-04-20 00:57:41.197146 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv4] **** 2026-04-20 00:57:41.197151 | orchestrator | Monday 20 April 2026 00:50:02 +0000 (0:00:00.609) 0:02:58.878 ********** 2026-04-20 00:57:41.197157 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.197163 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.197169 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.197175 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.197181 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.197187 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.197192 | orchestrator | 2026-04-20 00:57:41.197205 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv6] **** 2026-04-20 00:57:41.197259 | orchestrator | Monday 20 April 2026 00:50:03 +0000 (0:00:00.838) 0:02:59.717 ********** 2026-04-20 00:57:41.197266 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.197272 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.197278 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.197284 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.197289 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.197295 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.197300 | orchestrator | 2026-04-20 00:57:41.197306 | orchestrator | TASK 
[ceph-facts : Set_fact _radosgw_address to radosgw_address] *************** 2026-04-20 00:57:41.197312 | orchestrator | Monday 20 April 2026 00:50:03 +0000 (0:00:00.584) 0:03:00.301 ********** 2026-04-20 00:57:41.197318 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.197324 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.197330 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.197335 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.197342 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.197347 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.197353 | orchestrator | 2026-04-20 00:57:41.197359 | orchestrator | TASK [ceph-facts : Set_fact _interface] **************************************** 2026-04-20 00:57:41.197365 | orchestrator | Monday 20 April 2026 00:50:05 +0000 (0:00:01.113) 0:03:01.416 ********** 2026-04-20 00:57:41.197371 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-20 00:57:41.197378 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-20 00:57:41.197383 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-20 00:57:41.197388 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.197394 | orchestrator | 2026-04-20 00:57:41.197400 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv4] ****** 2026-04-20 00:57:41.197406 | orchestrator | Monday 20 April 2026 00:50:05 +0000 (0:00:00.441) 0:03:01.857 ********** 2026-04-20 00:57:41.197411 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-20 00:57:41.197417 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-20 00:57:41.197422 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-20 00:57:41.197428 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.197434 | orchestrator | 2026-04-20 00:57:41.197440 | orchestrator | TASK 
[ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv6] ****** 2026-04-20 00:57:41.197446 | orchestrator | Monday 20 April 2026 00:50:06 +0000 (0:00:00.607) 0:03:02.465 ********** 2026-04-20 00:57:41.197452 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-20 00:57:41.197458 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-20 00:57:41.197464 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-20 00:57:41.197470 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.197476 | orchestrator | 2026-04-20 00:57:41.197519 | orchestrator | TASK [ceph-facts : Reset rgw_instances (workaround)] *************************** 2026-04-20 00:57:41.197537 | orchestrator | Monday 20 April 2026 00:50:06 +0000 (0:00:00.479) 0:03:02.944 ********** 2026-04-20 00:57:41.197543 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.197551 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.197556 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.197576 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.197582 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.197587 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.197593 | orchestrator | 2026-04-20 00:57:41.197599 | orchestrator | TASK [ceph-facts : Set_fact rgw_instances] ************************************* 2026-04-20 00:57:41.197604 | orchestrator | Monday 20 April 2026 00:50:07 +0000 (0:00:00.886) 0:03:03.831 ********** 2026-04-20 00:57:41.197610 | orchestrator | ok: [testbed-node-3] => (item=0) 2026-04-20 00:57:41.197616 | orchestrator | skipping: [testbed-node-0] => (item=0)  2026-04-20 00:57:41.197622 | orchestrator | ok: [testbed-node-4] => (item=0) 2026-04-20 00:57:41.197628 | orchestrator | ok: [testbed-node-5] => (item=0) 2026-04-20 00:57:41.197634 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.197641 | orchestrator | skipping: [testbed-node-1] => 
(item=0)  2026-04-20 00:57:41.197645 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.197649 | orchestrator | skipping: [testbed-node-2] => (item=0)  2026-04-20 00:57:41.197653 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.197656 | orchestrator | 2026-04-20 00:57:41.197661 | orchestrator | TASK [ceph-config : Generate Ceph file] **************************************** 2026-04-20 00:57:41.197664 | orchestrator | Monday 20 April 2026 00:50:09 +0000 (0:00:01.740) 0:03:05.571 ********** 2026-04-20 00:57:41.197668 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.197672 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.197676 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.197680 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:57:41.197684 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.197687 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.197691 | orchestrator | 2026-04-20 00:57:41.197695 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2026-04-20 00:57:41.197699 | orchestrator | Monday 20 April 2026 00:50:11 +0000 (0:00:02.540) 0:03:08.111 ********** 2026-04-20 00:57:41.197702 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.197706 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.197710 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.197714 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:57:41.197717 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.197721 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.197725 | orchestrator | 2026-04-20 00:57:41.197728 | orchestrator | RUNNING HANDLER [ceph-handler : Mons handler] ********************************** 2026-04-20 00:57:41.197732 | orchestrator | Monday 20 April 2026 00:50:12 +0000 (0:00:01.162) 0:03:09.274 ********** 2026-04-20 00:57:41.197736 | orchestrator | skipping: 
[testbed-node-3] 2026-04-20 00:57:41.197740 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.197743 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.197748 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mons.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:57:41.197752 | orchestrator | 2026-04-20 00:57:41.197761 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mon_handler_called before restart] ******** 2026-04-20 00:57:41.197765 | orchestrator | Monday 20 April 2026 00:50:13 +0000 (0:00:00.704) 0:03:09.978 ********** 2026-04-20 00:57:41.197769 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.197773 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.197776 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.197780 | orchestrator | 2026-04-20 00:57:41.197784 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mon restart script] *********************** 2026-04-20 00:57:41.197788 | orchestrator | Monday 20 April 2026 00:50:13 +0000 (0:00:00.254) 0:03:10.233 ********** 2026-04-20 00:57:41.197791 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:57:41.197799 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.197803 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.197807 | orchestrator | 2026-04-20 00:57:41.197811 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mon daemon(s)] ******************** 2026-04-20 00:57:41.197814 | orchestrator | Monday 20 April 2026 00:50:15 +0000 (0:00:01.432) 0:03:11.665 ********** 2026-04-20 00:57:41.197818 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2026-04-20 00:57:41.197822 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2026-04-20 00:57:41.197826 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2026-04-20 00:57:41.197829 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.197833 | orchestrator | 2026-04-20 
00:57:41.197837 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mon_handler_called after restart] ********* 2026-04-20 00:57:41.197840 | orchestrator | Monday 20 April 2026 00:50:15 +0000 (0:00:00.545) 0:03:12.211 ********** 2026-04-20 00:57:41.197844 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.197848 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.197852 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.197855 | orchestrator | 2026-04-20 00:57:41.197859 | orchestrator | RUNNING HANDLER [ceph-handler : Osds handler] ********************************** 2026-04-20 00:57:41.197863 | orchestrator | Monday 20 April 2026 00:50:16 +0000 (0:00:00.291) 0:03:12.502 ********** 2026-04-20 00:57:41.197866 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.197870 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.197874 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.197877 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_osds.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:57:41.197881 | orchestrator | 2026-04-20 00:57:41.197885 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact trigger_restart] ********************** 2026-04-20 00:57:41.197889 | orchestrator | Monday 20 April 2026 00:50:17 +0000 (0:00:00.899) 0:03:13.402 ********** 2026-04-20 00:57:41.197892 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-20 00:57:41.197896 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-20 00:57:41.197900 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-20 00:57:41.197903 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.197907 | orchestrator | 2026-04-20 00:57:41.197933 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called before restart] ******** 2026-04-20 00:57:41.197938 | orchestrator | Monday 20 April 2026 00:50:17 +0000 (0:00:00.333) 
0:03:13.736 ********** 2026-04-20 00:57:41.197942 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.197946 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.197949 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.197953 | orchestrator | 2026-04-20 00:57:41.197957 | orchestrator | RUNNING HANDLER [ceph-handler : Unset noup flag] ******************************* 2026-04-20 00:57:41.197961 | orchestrator | Monday 20 April 2026 00:50:17 +0000 (0:00:00.306) 0:03:14.042 ********** 2026-04-20 00:57:41.197965 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.197968 | orchestrator | 2026-04-20 00:57:41.197972 | orchestrator | RUNNING HANDLER [ceph-handler : Copy osd restart script] *********************** 2026-04-20 00:57:41.197976 | orchestrator | Monday 20 April 2026 00:50:17 +0000 (0:00:00.188) 0:03:14.230 ********** 2026-04-20 00:57:41.197980 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.197983 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.197987 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.197991 | orchestrator | 2026-04-20 00:57:41.197995 | orchestrator | RUNNING HANDLER [ceph-handler : Get pool list] ********************************* 2026-04-20 00:57:41.197998 | orchestrator | Monday 20 April 2026 00:50:18 +0000 (0:00:00.276) 0:03:14.506 ********** 2026-04-20 00:57:41.198002 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.198006 | orchestrator | 2026-04-20 00:57:41.198010 | orchestrator | RUNNING HANDLER [ceph-handler : Get balancer module status] ******************** 2026-04-20 00:57:41.198061 | orchestrator | Monday 20 April 2026 00:50:18 +0000 (0:00:00.179) 0:03:14.686 ********** 2026-04-20 00:57:41.198069 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.198073 | orchestrator | 2026-04-20 00:57:41.198077 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact pools_pgautoscaler_mode] ************** 2026-04-20 
00:57:41.198081 | orchestrator | Monday 20 April 2026 00:50:18 +0000 (0:00:00.611) 0:03:15.297 ********** 2026-04-20 00:57:41.198085 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.198090 | orchestrator | 2026-04-20 00:57:41.198094 | orchestrator | RUNNING HANDLER [ceph-handler : Disable balancer] ****************************** 2026-04-20 00:57:41.198098 | orchestrator | Monday 20 April 2026 00:50:19 +0000 (0:00:00.103) 0:03:15.401 ********** 2026-04-20 00:57:41.198102 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.198108 | orchestrator | 2026-04-20 00:57:41.198114 | orchestrator | RUNNING HANDLER [ceph-handler : Disable pg autoscale on pools] ***************** 2026-04-20 00:57:41.198121 | orchestrator | Monday 20 April 2026 00:50:19 +0000 (0:00:00.181) 0:03:15.582 ********** 2026-04-20 00:57:41.198129 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.198139 | orchestrator | 2026-04-20 00:57:41.198147 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph osds daemon(s)] ******************* 2026-04-20 00:57:41.198153 | orchestrator | Monday 20 April 2026 00:50:19 +0000 (0:00:00.214) 0:03:15.797 ********** 2026-04-20 00:57:41.198159 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-20 00:57:41.198167 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-20 00:57:41.198173 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-20 00:57:41.198179 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.198186 | orchestrator | 2026-04-20 00:57:41.198197 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called after restart] ********* 2026-04-20 00:57:41.198204 | orchestrator | Monday 20 April 2026 00:50:19 +0000 (0:00:00.362) 0:03:16.159 ********** 2026-04-20 00:57:41.198224 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.198230 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.198235 | 
orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.198240 | orchestrator | 2026-04-20 00:57:41.198247 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable pg autoscale on pools] *************** 2026-04-20 00:57:41.198253 | orchestrator | Monday 20 April 2026 00:50:20 +0000 (0:00:00.275) 0:03:16.435 ********** 2026-04-20 00:57:41.198258 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.198264 | orchestrator | 2026-04-20 00:57:41.198271 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable balancer] **************************** 2026-04-20 00:57:41.198277 | orchestrator | Monday 20 April 2026 00:50:20 +0000 (0:00:00.205) 0:03:16.640 ********** 2026-04-20 00:57:41.198283 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.198289 | orchestrator | 2026-04-20 00:57:41.198295 | orchestrator | RUNNING HANDLER [ceph-handler : Mdss handler] ********************************** 2026-04-20 00:57:41.198301 | orchestrator | Monday 20 April 2026 00:50:20 +0000 (0:00:00.211) 0:03:16.852 ********** 2026-04-20 00:57:41.198307 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.198313 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.198318 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.198324 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mdss.yml for testbed-node-3, testbed-node-5, testbed-node-4 2026-04-20 00:57:41.198330 | orchestrator | 2026-04-20 00:57:41.198336 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mds_handler_called before restart] ******** 2026-04-20 00:57:41.198342 | orchestrator | Monday 20 April 2026 00:50:21 +0000 (0:00:01.003) 0:03:17.855 ********** 2026-04-20 00:57:41.198347 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.198353 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.198359 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.198365 | orchestrator | 2026-04-20 00:57:41.198370 | orchestrator | RUNNING HANDLER 
[ceph-handler : Copy mds restart script] *********************** 2026-04-20 00:57:41.198376 | orchestrator | Monday 20 April 2026 00:50:21 +0000 (0:00:00.254) 0:03:18.110 ********** 2026-04-20 00:57:41.198381 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.198398 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.198404 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.198410 | orchestrator | 2026-04-20 00:57:41.198416 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mds daemon(s)] ******************** 2026-04-20 00:57:41.198424 | orchestrator | Monday 20 April 2026 00:50:23 +0000 (0:00:01.390) 0:03:19.501 ********** 2026-04-20 00:57:41.198427 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-20 00:57:41.198431 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-20 00:57:41.198435 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-20 00:57:41.198439 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.198442 | orchestrator | 2026-04-20 00:57:41.198471 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mds_handler_called after restart] ********* 2026-04-20 00:57:41.198476 | orchestrator | Monday 20 April 2026 00:50:23 +0000 (0:00:00.531) 0:03:20.032 ********** 2026-04-20 00:57:41.198480 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.198483 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.198487 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.198491 | orchestrator | 2026-04-20 00:57:41.198495 | orchestrator | RUNNING HANDLER [ceph-handler : Rgws handler] ********************************** 2026-04-20 00:57:41.198499 | orchestrator | Monday 20 April 2026 00:50:23 +0000 (0:00:00.266) 0:03:20.299 ********** 2026-04-20 00:57:41.198502 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.198506 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.198510 | orchestrator | 
skipping: [testbed-node-1] 2026-04-20 00:57:41.198513 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_rgws.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:57:41.198517 | orchestrator | 2026-04-20 00:57:41.198521 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called before restart] ******** 2026-04-20 00:57:41.198525 | orchestrator | Monday 20 April 2026 00:50:24 +0000 (0:00:00.890) 0:03:21.189 ********** 2026-04-20 00:57:41.198529 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.198532 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.198536 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.198540 | orchestrator | 2026-04-20 00:57:41.198544 | orchestrator | RUNNING HANDLER [ceph-handler : Copy rgw restart script] *********************** 2026-04-20 00:57:41.198548 | orchestrator | Monday 20 April 2026 00:50:25 +0000 (0:00:00.407) 0:03:21.596 ********** 2026-04-20 00:57:41.198551 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.198555 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.198559 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.198562 | orchestrator | 2026-04-20 00:57:41.198566 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph rgw daemon(s)] ******************** 2026-04-20 00:57:41.198570 | orchestrator | Monday 20 April 2026 00:50:26 +0000 (0:00:01.280) 0:03:22.876 ********** 2026-04-20 00:57:41.198574 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-20 00:57:41.198577 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-20 00:57:41.198581 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-20 00:57:41.198585 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.198589 | orchestrator | 2026-04-20 00:57:41.198592 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called after restart] ********* 2026-04-20 
00:57:41.198596 | orchestrator | Monday 20 April 2026 00:50:27 +0000 (0:00:00.694) 0:03:23.571 ********** 2026-04-20 00:57:41.198600 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.198604 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.198608 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.198614 | orchestrator | 2026-04-20 00:57:41.198620 | orchestrator | RUNNING HANDLER [ceph-handler : Rbdmirrors handler] **************************** 2026-04-20 00:57:41.198626 | orchestrator | Monday 20 April 2026 00:50:27 +0000 (0:00:00.282) 0:03:23.853 ********** 2026-04-20 00:57:41.198632 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.198638 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.198648 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.198659 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.198666 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.198672 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.198678 | orchestrator | 2026-04-20 00:57:41.198684 | orchestrator | RUNNING HANDLER [ceph-handler : Mgrs handler] ********************************** 2026-04-20 00:57:41.198690 | orchestrator | Monday 20 April 2026 00:50:28 +0000 (0:00:00.678) 0:03:24.531 ********** 2026-04-20 00:57:41.198697 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.198703 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.198709 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.198716 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mgrs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:57:41.198722 | orchestrator | 2026-04-20 00:57:41.198729 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called before restart] ******** 2026-04-20 00:57:41.198736 | orchestrator | Monday 20 April 2026 00:50:28 +0000 (0:00:00.835) 0:03:25.367 ********** 2026-04-20 00:57:41.198740 | orchestrator | 
ok: [testbed-node-0] 2026-04-20 00:57:41.198743 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.198747 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.198751 | orchestrator | 2026-04-20 00:57:41.198754 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mgr restart script] *********************** 2026-04-20 00:57:41.198759 | orchestrator | Monday 20 April 2026 00:50:29 +0000 (0:00:00.294) 0:03:25.661 ********** 2026-04-20 00:57:41.198765 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:57:41.198771 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.198777 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.198782 | orchestrator | 2026-04-20 00:57:41.198787 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mgr daemon(s)] ******************** 2026-04-20 00:57:41.198793 | orchestrator | Monday 20 April 2026 00:50:30 +0000 (0:00:01.369) 0:03:27.031 ********** 2026-04-20 00:57:41.198798 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2026-04-20 00:57:41.198803 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2026-04-20 00:57:41.198810 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2026-04-20 00:57:41.198815 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.198821 | orchestrator | 2026-04-20 00:57:41.198828 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called after restart] ********* 2026-04-20 00:57:41.198834 | orchestrator | Monday 20 April 2026 00:50:31 +0000 (0:00:00.833) 0:03:27.864 ********** 2026-04-20 00:57:41.198841 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.198847 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.198853 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.198860 | orchestrator | 2026-04-20 00:57:41.198866 | orchestrator | PLAY [Apply role ceph-mon] ***************************************************** 2026-04-20 00:57:41.198872 | orchestrator | 2026-04-20 
00:57:41.198878 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************
2026-04-20 00:57:41.198884 | orchestrator | Monday 20 April 2026 00:50:32 +0000 (0:00:00.833) 0:03:28.698 **********
2026-04-20 00:57:41.198915 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:57:41.198920 | orchestrator |
2026-04-20 00:57:41.198924 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] *********************
2026-04-20 00:57:41.198928 | orchestrator | Monday 20 April 2026 00:50:32 +0000 (0:00:00.522) 0:03:29.221 **********
2026-04-20 00:57:41.198932 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:57:41.198936 | orchestrator |
2026-04-20 00:57:41.198940 | orchestrator | TASK [ceph-handler : Check for a mon container] ********************************
2026-04-20 00:57:41.198943 | orchestrator | Monday 20 April 2026 00:50:33 +0000 (0:00:00.726) 0:03:29.947 **********
2026-04-20 00:57:41.198947 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.198951 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.198960 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.198963 | orchestrator |
2026-04-20 00:57:41.198967 | orchestrator | TASK [ceph-handler : Check for an osd container] *******************************
2026-04-20 00:57:41.198971 | orchestrator | Monday 20 April 2026 00:50:34 +0000 (0:00:00.705) 0:03:30.653 **********
2026-04-20 00:57:41.198975 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.198979 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.198982 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.198986 | orchestrator |
2026-04-20 00:57:41.198990 | orchestrator | TASK [ceph-handler : Check for a mds container] ********************************
2026-04-20 00:57:41.198994 | orchestrator | Monday 20 April 2026 00:50:34 +0000 (0:00:00.281) 0:03:30.934 **********
2026-04-20 00:57:41.198997 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.199001 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.199005 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.199009 | orchestrator |
2026-04-20 00:57:41.199012 | orchestrator | TASK [ceph-handler : Check for a rgw container] ********************************
2026-04-20 00:57:41.199016 | orchestrator | Monday 20 April 2026 00:50:34 +0000 (0:00:00.299) 0:03:31.234 **********
2026-04-20 00:57:41.199020 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.199024 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.199028 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.199031 | orchestrator |
2026-04-20 00:57:41.199035 | orchestrator | TASK [ceph-handler : Check for a mgr container] ********************************
2026-04-20 00:57:41.199039 | orchestrator | Monday 20 April 2026 00:50:35 +0000 (0:00:00.283) 0:03:31.517 **********
2026-04-20 00:57:41.199044 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.199050 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.199056 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.199066 | orchestrator |
2026-04-20 00:57:41.199072 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] *************************
2026-04-20 00:57:41.199078 | orchestrator | Monday 20 April 2026 00:50:35 +0000 (0:00:00.799) 0:03:32.317 **********
2026-04-20 00:57:41.199084 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.199089 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.199096 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.199101 | orchestrator |
2026-04-20 00:57:41.199107 | orchestrator | TASK [ceph-handler : Check for a nfs container] ********************************
2026-04-20 00:57:41.199118 | orchestrator | Monday 20 April 2026 00:50:36 +0000 (0:00:00.260) 0:03:32.578 **********
2026-04-20 00:57:41.199124 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.199131 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.199137 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.199142 | orchestrator |
2026-04-20 00:57:41.199147 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] *************************
2026-04-20 00:57:41.199153 | orchestrator | Monday 20 April 2026 00:50:36 +0000 (0:00:00.254) 0:03:32.832 **********
2026-04-20 00:57:41.199161 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.199170 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.199176 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.199181 | orchestrator |
2026-04-20 00:57:41.199187 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] **********************
2026-04-20 00:57:41.199194 | orchestrator | Monday 20 April 2026 00:50:37 +0000 (0:00:00.611) 0:03:33.444 **********
2026-04-20 00:57:41.199200 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.199206 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.199231 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.199236 | orchestrator |
2026-04-20 00:57:41.199241 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] *******************
2026-04-20 00:57:41.199246 | orchestrator | Monday 20 April 2026 00:50:37 +0000 (0:00:00.881) 0:03:34.326 **********
2026-04-20 00:57:41.199251 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.199257 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.199263 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.199274 | orchestrator |
2026-04-20 00:57:41.199280 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ******************************
2026-04-20 00:57:41.199286 | orchestrator | Monday 20 April 2026 00:50:38 +0000 (0:00:00.282) 0:03:34.608 **********
2026-04-20 00:57:41.199291 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.199297 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.199302 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.199308 | orchestrator |
2026-04-20 00:57:41.199314 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ******************************
2026-04-20 00:57:41.199320 | orchestrator | Monday 20 April 2026 00:50:38 +0000 (0:00:00.267) 0:03:34.876 **********
2026-04-20 00:57:41.199326 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.199332 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.199338 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.199345 | orchestrator |
2026-04-20 00:57:41.199349 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ******************************
2026-04-20 00:57:41.199353 | orchestrator | Monday 20 April 2026 00:50:38 +0000 (0:00:00.269) 0:03:35.145 **********
2026-04-20 00:57:41.199356 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.199360 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.199364 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.199367 | orchestrator |
2026-04-20 00:57:41.199371 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ******************************
2026-04-20 00:57:41.199375 | orchestrator | Monday 20 April 2026 00:50:39 +0000 (0:00:00.403) 0:03:35.549 **********
2026-04-20 00:57:41.199402 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.199407 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.199410 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.199414 | orchestrator |
2026-04-20 00:57:41.199418 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ******************************
2026-04-20 00:57:41.199422 | orchestrator | Monday 20 April 2026 00:50:39 +0000 (0:00:00.271) 0:03:35.821 **********
2026-04-20 00:57:41.199425 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.199429 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.199433 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.199436 | orchestrator |
2026-04-20 00:57:41.199440 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ******************************
2026-04-20 00:57:41.199444 | orchestrator | Monday 20 April 2026 00:50:39 +0000 (0:00:00.247) 0:03:36.068 **********
2026-04-20 00:57:41.199448 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.199451 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.199455 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.199459 | orchestrator |
2026-04-20 00:57:41.199462 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ******************************
2026-04-20 00:57:41.199466 | orchestrator | Monday 20 April 2026 00:50:39 +0000 (0:00:00.256) 0:03:36.324 **********
2026-04-20 00:57:41.199470 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.199474 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.199477 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.199481 | orchestrator |
2026-04-20 00:57:41.199485 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] ****************************
2026-04-20 00:57:41.199489 | orchestrator | Monday 20 April 2026 00:50:40 +0000 (0:00:00.321) 0:03:36.646 **********
2026-04-20 00:57:41.199492 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.199496 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.199500 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.199504 | orchestrator |
2026-04-20 00:57:41.199507 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] *************************
2026-04-20 00:57:41.199511 | orchestrator | Monday 20 April 2026 00:50:40 +0000 (0:00:00.550) 0:03:37.197 **********
2026-04-20 00:57:41.199515 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.199519 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.199522 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.199526 | orchestrator |
2026-04-20 00:57:41.199530 | orchestrator | TASK [ceph-mon : Set_fact container_exec_cmd] **********************************
2026-04-20 00:57:41.199543 | orchestrator | Monday 20 April 2026 00:50:41 +0000 (0:00:00.515) 0:03:37.712 **********
2026-04-20 00:57:41.199546 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.199550 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.199554 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.199558 | orchestrator |
2026-04-20 00:57:41.199561 | orchestrator | TASK [ceph-mon : Include deploy_monitors.yml] **********************************
2026-04-20 00:57:41.199565 | orchestrator | Monday 20 April 2026 00:50:41 +0000 (0:00:00.290) 0:03:38.003 **********
2026-04-20 00:57:41.199570 | orchestrator | included: /ansible/roles/ceph-mon/tasks/deploy_monitors.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:57:41.199573 | orchestrator |
2026-04-20 00:57:41.199577 | orchestrator | TASK [ceph-mon : Check if monitor initial keyring already exists] **************
2026-04-20 00:57:41.199584 | orchestrator | Monday 20 April 2026 00:50:42 +0000 (0:00:00.771) 0:03:38.774 **********
2026-04-20 00:57:41.199588 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.199592 | orchestrator |
2026-04-20 00:57:41.199596 | orchestrator | TASK [ceph-mon : Generate monitor initial keyring] *****************************
2026-04-20 00:57:41.199599 | orchestrator | Monday 20 April 2026 00:50:42 +0000 (0:00:00.161) 0:03:38.936 **********
2026-04-20 00:57:41.199603 | orchestrator | changed: [testbed-node-0 -> localhost]
2026-04-20 00:57:41.199607 | orchestrator |
2026-04-20 00:57:41.199611 | orchestrator | TASK [ceph-mon : Set_fact _initial_mon_key_success] ****************************
2026-04-20 00:57:41.199614 | orchestrator | Monday 20 April 2026 00:50:43 +0000 (0:00:01.023) 0:03:39.959 **********
2026-04-20 00:57:41.199618 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.199622 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.199626 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.199629 | orchestrator |
2026-04-20 00:57:41.199633 | orchestrator | TASK [ceph-mon : Get initial keyring when it already exists] *******************
2026-04-20 00:57:41.199637 | orchestrator | Monday 20 April 2026 00:50:43 +0000 (0:00:00.343) 0:03:40.303 **********
2026-04-20 00:57:41.199640 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.199644 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.199648 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.199652 | orchestrator |
2026-04-20 00:57:41.199655 | orchestrator | TASK [ceph-mon : Create monitor initial keyring] *******************************
2026-04-20 00:57:41.199659 | orchestrator | Monday 20 April 2026 00:50:44 +0000 (0:00:00.383) 0:03:40.687 **********
2026-04-20 00:57:41.199663 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:57:41.199667 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:57:41.199670 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:57:41.199674 | orchestrator |
2026-04-20 00:57:41.199678 | orchestrator | TASK [ceph-mon : Copy the initial key in /etc/ceph (for containers)] ***********
2026-04-20 00:57:41.199681 | orchestrator | Monday 20 April 2026 00:50:45 +0000 (0:00:01.603) 0:03:42.290 **********
2026-04-20 00:57:41.199685 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:57:41.199689 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:57:41.199692 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:57:41.199696 | orchestrator |
2026-04-20 00:57:41.199700 | orchestrator | TASK [ceph-mon : Create monitor directory] *************************************
2026-04-20 00:57:41.199704 | orchestrator | Monday 20 April 2026 00:50:46 +0000 (0:00:00.730) 0:03:43.020 **********
2026-04-20 00:57:41.199707 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:57:41.199711 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:57:41.199715 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:57:41.199718 | orchestrator |
2026-04-20 00:57:41.199722 | orchestrator | TASK [ceph-mon : Recursively fix ownership of monitor directory] ***************
2026-04-20 00:57:41.199726 | orchestrator | Monday 20 April 2026 00:50:47 +0000 (0:00:00.631) 0:03:43.651 **********
2026-04-20 00:57:41.199730 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.199733 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.199737 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.199741 | orchestrator |
2026-04-20 00:57:41.199745 | orchestrator | TASK [ceph-mon : Create admin keyring] *****************************************
2026-04-20 00:57:41.199768 | orchestrator | Monday 20 April 2026 00:50:47 +0000 (0:00:00.645) 0:03:44.297 **********
2026-04-20 00:57:41.199773 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:57:41.199776 | orchestrator |
2026-04-20 00:57:41.199780 | orchestrator | TASK [ceph-mon : Slurp admin keyring] ******************************************
2026-04-20 00:57:41.199784 | orchestrator | Monday 20 April 2026 00:50:49 +0000 (0:00:01.545) 0:03:45.842 **********
2026-04-20 00:57:41.199788 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.199792 | orchestrator |
2026-04-20 00:57:41.199799 | orchestrator | TASK [ceph-mon : Copy admin keyring over to mons] ******************************
2026-04-20 00:57:41.199805 | orchestrator | Monday 20 April 2026 00:50:50 +0000 (0:00:00.783) 0:03:46.625 **********
2026-04-20 00:57:41.199811 | orchestrator | changed: [testbed-node-0] => (item=None)
2026-04-20 00:57:41.199817 | orchestrator | ok: [testbed-node-1 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:57:41.199823 | orchestrator | ok: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:57:41.199830 | orchestrator | changed: [testbed-node-0 -> testbed-node-1(192.168.16.11)] => (item=None)
2026-04-20 00:57:41.199836 | orchestrator | ok: [testbed-node-1] => (item=None)
2026-04-20 00:57:41.199842 | orchestrator | ok: [testbed-node-2 -> testbed-node-1(192.168.16.11)] => (item=None)
2026-04-20 00:57:41.199849 | orchestrator | changed: [testbed-node-0 -> testbed-node-2(192.168.16.12)] => (item=None)
2026-04-20 00:57:41.199855 | orchestrator | changed: [testbed-node-0 -> {{ item }}]
2026-04-20 00:57:41.199861 | orchestrator | ok: [testbed-node-1 -> testbed-node-2(192.168.16.12)] => (item=None)
2026-04-20 00:57:41.199867 | orchestrator | ok: [testbed-node-1 -> {{ item }}]
2026-04-20 00:57:41.199872 | orchestrator | ok: [testbed-node-2] => (item=None)
2026-04-20 00:57:41.199876 | orchestrator | ok: [testbed-node-2 -> {{ item }}]
2026-04-20 00:57:41.199880 | orchestrator |
2026-04-20 00:57:41.199884 | orchestrator | TASK [ceph-mon : Import admin keyring into mon keyring] ************************
2026-04-20 00:57:41.199887 | orchestrator | Monday 20 April 2026 00:50:53 +0000 (0:00:03.453) 0:03:50.079 **********
2026-04-20 00:57:41.199891 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:57:41.199895 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:57:41.199899 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:57:41.199902 | orchestrator |
2026-04-20 00:57:41.199906 | orchestrator | TASK [ceph-mon : Set_fact ceph-mon container command] **************************
2026-04-20 00:57:41.199910 | orchestrator | Monday 20 April 2026 00:50:54 +0000 (0:00:01.173) 0:03:51.253 **********
2026-04-20 00:57:41.199913 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.199917 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.199921 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.199925 | orchestrator |
2026-04-20 00:57:41.199928 | orchestrator | TASK [ceph-mon : Set_fact monmaptool container command] ************************
2026-04-20 00:57:41.199932 | orchestrator | Monday 20 April 2026 00:50:55 +0000 (0:00:00.310) 0:03:51.563 **********
2026-04-20 00:57:41.199936 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.199939 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.199957 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.199961 | orchestrator |
2026-04-20 00:57:41.199964 | orchestrator | TASK [ceph-mon : Generate initial monmap] **************************************
2026-04-20 00:57:41.199972 | orchestrator | Monday 20 April 2026 00:50:55 +0000 (0:00:00.307) 0:03:51.870 **********
2026-04-20 00:57:41.199976 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:57:41.199979 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:57:41.199983 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:57:41.199987 | orchestrator |
2026-04-20 00:57:41.199991 | orchestrator | TASK [ceph-mon : Ceph monitor mkfs with keyring] *******************************
2026-04-20 00:57:41.199994 | orchestrator | Monday 20 April 2026 00:50:57 +0000 (0:00:01.936) 0:03:53.807 **********
2026-04-20 00:57:41.199998 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:57:41.200002 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:57:41.200006 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:57:41.200014 | orchestrator |
2026-04-20 00:57:41.200018 | orchestrator | TASK [ceph-mon : Ceph monitor mkfs without keyring] ****************************
2026-04-20 00:57:41.200022 | orchestrator | Monday 20 April 2026 00:50:58 +0000 (0:00:01.318) 0:03:55.125 **********
2026-04-20 00:57:41.200025 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.200029 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.200033 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.200037 | orchestrator |
2026-04-20 00:57:41.200040 | orchestrator | TASK [ceph-mon : Include start_monitor.yml] ************************************
2026-04-20 00:57:41.200044 | orchestrator | Monday 20 April 2026 00:50:59 +0000 (0:00:00.300) 0:03:55.426 **********
2026-04-20 00:57:41.200048 | orchestrator | included: /ansible/roles/ceph-mon/tasks/start_monitor.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:57:41.200052 | orchestrator |
2026-04-20 00:57:41.200055 | orchestrator | TASK [ceph-mon : Ensure systemd service override directory exists] *************
2026-04-20 00:57:41.200059 | orchestrator | Monday 20 April 2026 00:50:59 +0000 (0:00:00.794) 0:03:56.220 **********
2026-04-20 00:57:41.200063 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.200067 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.200071 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.200074 | orchestrator |
2026-04-20 00:57:41.200078 | orchestrator | TASK [ceph-mon : Add ceph-mon systemd service overrides] ***********************
2026-04-20 00:57:41.200082 | orchestrator | Monday 20 April 2026 00:51:00 +0000 (0:00:00.325) 0:03:56.546 **********
2026-04-20 00:57:41.200086 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.200089 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.200093 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.200097 | orchestrator |
2026-04-20 00:57:41.200100 | orchestrator | TASK [ceph-mon : Include_tasks systemd.yml] ************************************
2026-04-20 00:57:41.200104 | orchestrator | Monday 20 April 2026 00:51:00 +0000 (0:00:00.317) 0:03:56.863 **********
2026-04-20 00:57:41.200108 | orchestrator | included: /ansible/roles/ceph-mon/tasks/systemd.yml for testbed-node-1, testbed-node-0, testbed-node-2
2026-04-20 00:57:41.200112 | orchestrator |
2026-04-20 00:57:41.200116 | orchestrator | TASK [ceph-mon : Generate systemd unit file for mon container] *****************
2026-04-20 00:57:41.200134 | orchestrator | Monday 20 April 2026 00:51:01 +0000 (0:00:00.841) 0:03:57.705 **********
2026-04-20 00:57:41.200139 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:57:41.200143 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:57:41.200147 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:57:41.200151 | orchestrator |
2026-04-20 00:57:41.200155 | orchestrator | TASK [ceph-mon : Generate systemd ceph-mon target file] ************************
2026-04-20 00:57:41.200158 | orchestrator | Monday 20 April 2026 00:51:03 +0000 (0:00:01.912) 0:03:59.617 **********
2026-04-20 00:57:41.200162 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:57:41.200166 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:57:41.200170 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:57:41.200173 | orchestrator |
2026-04-20 00:57:41.200177 | orchestrator | TASK [ceph-mon : Enable ceph-mon.target] ***************************************
2026-04-20 00:57:41.200181 | orchestrator | Monday 20 April 2026 00:51:04 +0000 (0:00:01.262) 0:04:00.880 **********
2026-04-20 00:57:41.200185 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:57:41.200188 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:57:41.200192 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:57:41.200196 | orchestrator |
2026-04-20 00:57:41.200200 | orchestrator | TASK [ceph-mon : Start the monitor service] ************************************
2026-04-20 00:57:41.200203 | orchestrator | Monday 20 April 2026 00:51:06 +0000 (0:00:01.859) 0:04:02.739 **********
2026-04-20 00:57:41.200228 | orchestrator | changed: [testbed-node-0]
2026-04-20 00:57:41.200233 | orchestrator | changed: [testbed-node-1]
2026-04-20 00:57:41.200237 | orchestrator | changed: [testbed-node-2]
2026-04-20 00:57:41.200241 | orchestrator |
2026-04-20 00:57:41.200245 | orchestrator | TASK [ceph-mon : Include_tasks ceph_keys.yml] **********************************
2026-04-20 00:57:41.200253 | orchestrator | Monday 20 April 2026 00:51:08 +0000 (0:00:02.183) 0:04:04.923 **********
2026-04-20 00:57:41.200257 | orchestrator | included: /ansible/roles/ceph-mon/tasks/ceph_keys.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:57:41.200261 | orchestrator |
2026-04-20 00:57:41.200264 | orchestrator | TASK [ceph-mon : Waiting for the monitor(s) to form the quorum...] *************
2026-04-20 00:57:41.200268 | orchestrator | Monday 20 April 2026 00:51:09 +0000 (0:00:00.526) 0:04:05.449 **********
2026-04-20 00:57:41.200272 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Waiting for the monitor(s) to form the quorum... (10 retries left).
2026-04-20 00:57:41.200276 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.200280 | orchestrator |
2026-04-20 00:57:41.200283 | orchestrator | TASK [ceph-mon : Fetch ceph initial keys] **************************************
2026-04-20 00:57:41.200287 | orchestrator | Monday 20 April 2026 00:51:31 +0000 (0:00:22.048) 0:04:27.497 **********
2026-04-20 00:57:41.200291 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.200295 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.200298 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.200302 | orchestrator |
2026-04-20 00:57:41.200306 | orchestrator | TASK [ceph-mon : Include secure_cluster.yml] ***********************************
2026-04-20 00:57:41.200309 | orchestrator | Monday 20 April 2026 00:51:40 +0000 (0:00:09.110) 0:04:36.608 **********
2026-04-20 00:57:41.200313 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.200317 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.200321 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.200324 | orchestrator |
2026-04-20 00:57:41.200331 | orchestrator | TASK [ceph-mon : Set cluster configs] ******************************************
2026-04-20 00:57:41.200335 | orchestrator | Monday 20 April 2026 00:51:40 +0000 (0:00:00.545) 0:04:37.154 **********
2026-04-20 00:57:41.200341 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__34545e4c793a90cc89c2b870ac5752b74afe9328'}}, {'key': 'public_network', 'value': '192.168.16.0/20'}])
2026-04-20 00:57:41.200348 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__34545e4c793a90cc89c2b870ac5752b74afe9328'}}, {'key': 'cluster_network', 'value': '192.168.16.0/20'}])
2026-04-20 00:57:41.200353 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__34545e4c793a90cc89c2b870ac5752b74afe9328'}}, {'key': 'osd_pool_default_crush_rule', 'value': -1}])
2026-04-20 00:57:41.200359 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__34545e4c793a90cc89c2b870ac5752b74afe9328'}}, {'key': 'ms_bind_ipv6', 'value': 'False'}])
2026-04-20 00:57:41.200378 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__34545e4c793a90cc89c2b870ac5752b74afe9328'}}, {'key': 'ms_bind_ipv4', 'value': 'True'}])
2026-04-20 00:57:41.200384 | orchestrator | skipping: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__34545e4c793a90cc89c2b870ac5752b74afe9328'}}, {'key': 'osd_crush_chooseleaf_type', 'value': '__omit_place_holder__34545e4c793a90cc89c2b870ac5752b74afe9328'}])
2026-04-20 00:57:41.200393 | orchestrator |
2026-04-20 00:57:41.200397 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] **********************
2026-04-20 00:57:41.200401 | orchestrator | Monday 20 April 2026 00:51:55 +0000 (0:00:14.419) 0:04:51.574 **********
2026-04-20 00:57:41.200405 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.200409 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.200412 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.200416 | orchestrator |
2026-04-20 00:57:41.200420 | orchestrator | RUNNING HANDLER [ceph-handler : Mons handler] **********************************
2026-04-20 00:57:41.200424 | orchestrator | Monday 20 April 2026 00:51:55 +0000 (0:00:00.283) 0:04:51.857 **********
2026-04-20 00:57:41.200427 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mons.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:57:41.200431 | orchestrator |
2026-04-20 00:57:41.200435 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mon_handler_called before restart] ********
2026-04-20 00:57:41.200439 | orchestrator | Monday 20 April 2026 00:51:56 +0000 (0:00:00.607) 0:04:52.464 **********
2026-04-20 00:57:41.200442 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.200446 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.200450 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.200454 | orchestrator |
2026-04-20 00:57:41.200457 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mon restart script] ***********************
2026-04-20 00:57:41.200461 | orchestrator | Monday 20 April 2026 00:51:56 +0000 (0:00:00.265) 0:04:52.730 **********
2026-04-20 00:57:41.200465 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.200469 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.200472 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.200476 | orchestrator |
2026-04-20 00:57:41.200480 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mon daemon(s)] ********************
2026-04-20 00:57:41.200483 | orchestrator | Monday 20 April 2026 00:51:56 +0000 (0:00:00.270) 0:04:53.001 **********
2026-04-20 00:57:41.200487 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)
2026-04-20 00:57:41.200491 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)
2026-04-20 00:57:41.200495 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)
2026-04-20 00:57:41.200498 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.200502 | orchestrator |
2026-04-20 00:57:41.200506 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mon_handler_called after restart] *********
2026-04-20 00:57:41.200512 | orchestrator | Monday 20 April 2026 00:51:57 +0000 (0:00:00.671) 0:04:53.672 **********
2026-04-20 00:57:41.200516 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.200520 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.200523 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.200527 | orchestrator |
2026-04-20 00:57:41.200531 | orchestrator | PLAY [Apply role ceph-mgr] *****************************************************
2026-04-20 00:57:41.200535 | orchestrator |
2026-04-20 00:57:41.200538 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************
2026-04-20 00:57:41.200542 | orchestrator | Monday 20 April 2026 00:51:57 +0000 (0:00:00.653) 0:04:54.325 **********
2026-04-20 00:57:41.200546 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:57:41.200550 | orchestrator |
2026-04-20 00:57:41.200554 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] *********************
2026-04-20 00:57:41.200558 | orchestrator | Monday 20 April 2026 00:51:58 +0000 (0:00:00.456) 0:04:54.782 **********
2026-04-20 00:57:41.200562 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:57:41.200565 | orchestrator |
2026-04-20 00:57:41.200569 | orchestrator | TASK [ceph-handler : Check for a mon container] ********************************
2026-04-20 00:57:41.200581 | orchestrator | Monday 20 April 2026 00:51:59 +0000 (0:00:00.621) 0:04:55.404 **********
2026-04-20 00:57:41.200584 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.200588 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.200592 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.200595 | orchestrator |
2026-04-20 00:57:41.200599 | orchestrator | TASK [ceph-handler : Check for an osd container] *******************************
2026-04-20 00:57:41.200603 | orchestrator | Monday 20 April 2026 00:51:59 +0000 (0:00:00.698) 0:04:56.102 **********
2026-04-20 00:57:41.200607 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.200610 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.200614 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.200618 | orchestrator |
2026-04-20 00:57:41.200622 | orchestrator | TASK [ceph-handler : Check for a mds container] ********************************
2026-04-20 00:57:41.200625 | orchestrator | Monday 20 April 2026 00:51:59 +0000 (0:00:00.258) 0:04:56.360 **********
2026-04-20 00:57:41.200629 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.200633 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.200636 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.200640 | orchestrator |
2026-04-20 00:57:41.200644 | orchestrator | TASK [ceph-handler : Check for a rgw container] ********************************
2026-04-20 00:57:41.200648 | orchestrator | Monday 20 April 2026 00:52:00 +0000 (0:00:00.294) 0:04:56.655 **********
2026-04-20 00:57:41.200651 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.200655 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.200659 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.200662 | orchestrator |
2026-04-20 00:57:41.200666 | orchestrator | TASK [ceph-handler : Check for a mgr container] ********************************
2026-04-20 00:57:41.200683 | orchestrator | Monday 20 April 2026 00:52:00 +0000 (0:00:00.463) 0:04:57.119 **********
2026-04-20 00:57:41.200687 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.200691 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.200695 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.200699 | orchestrator |
2026-04-20 00:57:41.200702 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] *************************
2026-04-20 00:57:41.200706 | orchestrator | Monday 20 April 2026 00:52:01 +0000 (0:00:00.664) 0:04:57.783 **********
2026-04-20 00:57:41.200710 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.200714 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.200717 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.200721 | orchestrator |
2026-04-20 00:57:41.200725 | orchestrator | TASK [ceph-handler : Check for a nfs container] ********************************
2026-04-20 00:57:41.200729 | orchestrator | Monday 20 April 2026 00:52:01 +0000 (0:00:00.278) 0:04:58.062 **********
2026-04-20 00:57:41.200732 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.200736 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.200740 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.200743 | orchestrator |
2026-04-20 00:57:41.200747 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] *************************
2026-04-20 00:57:41.200751 | orchestrator | Monday 20 April 2026 00:52:01 +0000 (0:00:00.248) 0:04:58.310 **********
2026-04-20 00:57:41.200755 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.200758 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.200762 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.200766 | orchestrator |
2026-04-20 00:57:41.200770 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] **********************
2026-04-20 00:57:41.200773 | orchestrator | Monday 20 April 2026 00:52:02 +0000 (0:00:00.844) 0:04:59.154 **********
2026-04-20 00:57:41.200777 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.200781 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.200785 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.200788 | orchestrator |
2026-04-20 00:57:41.200792 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] *******************
2026-04-20 00:57:41.200796 | orchestrator | Monday 20 April 2026 00:52:03 +0000 (0:00:00.745) 0:04:59.900 **********
2026-04-20 00:57:41.200804 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.200808 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.200811 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.200815 | orchestrator |
2026-04-20 00:57:41.200819 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ******************************
2026-04-20 00:57:41.200823 | orchestrator | Monday 20 April 2026 00:52:03 +0000 (0:00:00.256) 0:05:00.157 **********
2026-04-20 00:57:41.200826 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.200830 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.200834 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.200838 | orchestrator |
2026-04-20 00:57:41.200841 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ******************************
2026-04-20 00:57:41.200845 | orchestrator | Monday 20 April 2026 00:52:04 +0000 (0:00:00.346) 0:05:00.503 **********
2026-04-20 00:57:41.200849 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.200853 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.200856 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.200860 | orchestrator |
2026-04-20 00:57:41.200867 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ******************************
2026-04-20 00:57:41.200871 | orchestrator | Monday 20 April 2026 00:52:04 +0000 (0:00:00.271) 0:05:00.775 **********
2026-04-20 00:57:41.200874 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.200878 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.200882 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.200885 | orchestrator |
2026-04-20 00:57:41.200889 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ******************************
2026-04-20 00:57:41.200893 | orchestrator | Monday 20 April 2026 00:52:04 +0000 (0:00:00.407) 0:05:01.183 **********
2026-04-20 00:57:41.200897 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.200900 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.200904 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.200908 | orchestrator |
2026-04-20 00:57:41.200911 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ******************************
2026-04-20 00:57:41.200915 | orchestrator | Monday 20 April 2026 00:52:05 +0000 (0:00:00.256) 0:05:01.439 **********
2026-04-20 00:57:41.200919 |
orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.200923 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.200926 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.200930 | orchestrator | 2026-04-20 00:57:41.200934 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2026-04-20 00:57:41.200937 | orchestrator | Monday 20 April 2026 00:52:05 +0000 (0:00:00.271) 0:05:01.711 ********** 2026-04-20 00:57:41.200941 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.200945 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.200949 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.200952 | orchestrator | 2026-04-20 00:57:41.200956 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2026-04-20 00:57:41.200960 | orchestrator | Monday 20 April 2026 00:52:05 +0000 (0:00:00.258) 0:05:01.969 ********** 2026-04-20 00:57:41.200964 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.200967 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.200971 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.200975 | orchestrator | 2026-04-20 00:57:41.200979 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2026-04-20 00:57:41.200982 | orchestrator | Monday 20 April 2026 00:52:06 +0000 (0:00:00.442) 0:05:02.411 ********** 2026-04-20 00:57:41.200986 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.200990 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.200994 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.200997 | orchestrator | 2026-04-20 00:57:41.201001 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2026-04-20 00:57:41.201005 | orchestrator | Monday 20 April 2026 00:52:06 +0000 (0:00:00.267) 0:05:02.679 ********** 2026-04-20 00:57:41.201008 | orchestrator | ok: [testbed-node-0] 
2026-04-20 00:57:41.201020 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.201026 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.201032 | orchestrator | 2026-04-20 00:57:41.201038 | orchestrator | TASK [ceph-mgr : Set_fact container_exec_cmd] ********************************** 2026-04-20 00:57:41.201061 | orchestrator | Monday 20 April 2026 00:52:06 +0000 (0:00:00.462) 0:05:03.141 ********** 2026-04-20 00:57:41.201067 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0) 2026-04-20 00:57:41.201073 | orchestrator | ok: [testbed-node-0 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2026-04-20 00:57:41.201079 | orchestrator | ok: [testbed-node-0 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2026-04-20 00:57:41.201086 | orchestrator | 2026-04-20 00:57:41.201092 | orchestrator | TASK [ceph-mgr : Include common.yml] ******************************************* 2026-04-20 00:57:41.201099 | orchestrator | Monday 20 April 2026 00:52:07 +0000 (0:00:00.722) 0:05:03.863 ********** 2026-04-20 00:57:41.201105 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/common.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:57:41.201111 | orchestrator | 2026-04-20 00:57:41.201116 | orchestrator | TASK [ceph-mgr : Create mgr directory] ***************************************** 2026-04-20 00:57:41.201122 | orchestrator | Monday 20 April 2026 00:52:08 +0000 (0:00:00.737) 0:05:04.601 ********** 2026-04-20 00:57:41.201129 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:57:41.201133 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.201137 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.201141 | orchestrator | 2026-04-20 00:57:41.201144 | orchestrator | TASK [ceph-mgr : Fetch ceph mgr keyring] *************************************** 2026-04-20 00:57:41.201148 | orchestrator | Monday 20 April 2026 00:52:08 +0000 (0:00:00.640) 0:05:05.241 ********** 2026-04-20 00:57:41.201152 | 
orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.201155 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.201159 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.201163 | orchestrator | 2026-04-20 00:57:41.201167 | orchestrator | TASK [ceph-mgr : Create ceph mgr keyring(s) on a mon node] ********************* 2026-04-20 00:57:41.201170 | orchestrator | Monday 20 April 2026 00:52:09 +0000 (0:00:00.292) 0:05:05.534 ********** 2026-04-20 00:57:41.201174 | orchestrator | changed: [testbed-node-0] => (item=None) 2026-04-20 00:57:41.201178 | orchestrator | changed: [testbed-node-0] => (item=None) 2026-04-20 00:57:41.201182 | orchestrator | changed: [testbed-node-0] => (item=None) 2026-04-20 00:57:41.201186 | orchestrator | changed: [testbed-node-0 -> {{ groups[mon_group_name][0] }}] 2026-04-20 00:57:41.201189 | orchestrator | 2026-04-20 00:57:41.201193 | orchestrator | TASK [ceph-mgr : Set_fact _mgr_keys] ******************************************* 2026-04-20 00:57:41.201197 | orchestrator | Monday 20 April 2026 00:52:18 +0000 (0:00:09.436) 0:05:14.970 ********** 2026-04-20 00:57:41.201201 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.201204 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.201249 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.201254 | orchestrator | 2026-04-20 00:57:41.201257 | orchestrator | TASK [ceph-mgr : Get keys from monitors] *************************************** 2026-04-20 00:57:41.201261 | orchestrator | Monday 20 April 2026 00:52:19 +0000 (0:00:00.494) 0:05:15.465 ********** 2026-04-20 00:57:41.201265 | orchestrator | skipping: [testbed-node-0] => (item=None)  2026-04-20 00:57:41.201269 | orchestrator | skipping: [testbed-node-1] => (item=None)  2026-04-20 00:57:41.201273 | orchestrator | skipping: [testbed-node-2] => (item=None)  2026-04-20 00:57:41.201277 | orchestrator | ok: [testbed-node-0] => (item=None) 2026-04-20 00:57:41.201284 | orchestrator | ok: 
[testbed-node-1 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-20 00:57:41.201288 | orchestrator | ok: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-20 00:57:41.201292 | orchestrator | 2026-04-20 00:57:41.201296 | orchestrator | TASK [ceph-mgr : Copy ceph key(s) if needed] *********************************** 2026-04-20 00:57:41.201299 | orchestrator | Monday 20 April 2026 00:52:20 +0000 (0:00:01.890) 0:05:17.355 ********** 2026-04-20 00:57:41.201307 | orchestrator | skipping: [testbed-node-0] => (item=None)  2026-04-20 00:57:41.201311 | orchestrator | skipping: [testbed-node-1] => (item=None)  2026-04-20 00:57:41.201315 | orchestrator | skipping: [testbed-node-2] => (item=None)  2026-04-20 00:57:41.201319 | orchestrator | changed: [testbed-node-0] => (item=None) 2026-04-20 00:57:41.201323 | orchestrator | changed: [testbed-node-1] => (item=None) 2026-04-20 00:57:41.201329 | orchestrator | changed: [testbed-node-2] => (item=None) 2026-04-20 00:57:41.201335 | orchestrator | 2026-04-20 00:57:41.201341 | orchestrator | TASK [ceph-mgr : Set mgr key permissions] ************************************** 2026-04-20 00:57:41.201346 | orchestrator | Monday 20 April 2026 00:52:22 +0000 (0:00:01.045) 0:05:18.401 ********** 2026-04-20 00:57:41.201351 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.201358 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.201363 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.201368 | orchestrator | 2026-04-20 00:57:41.201374 | orchestrator | TASK [ceph-mgr : Append dashboard modules to ceph_mgr_modules] ***************** 2026-04-20 00:57:41.201379 | orchestrator | Monday 20 April 2026 00:52:22 +0000 (0:00:00.586) 0:05:18.987 ********** 2026-04-20 00:57:41.201384 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.201391 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.201397 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.201403 | 
orchestrator | 2026-04-20 00:57:41.201409 | orchestrator | TASK [ceph-mgr : Include pre_requisite.yml] ************************************ 2026-04-20 00:57:41.201414 | orchestrator | Monday 20 April 2026 00:52:23 +0000 (0:00:00.498) 0:05:19.485 ********** 2026-04-20 00:57:41.201420 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.201426 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.201432 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.201437 | orchestrator | 2026-04-20 00:57:41.201444 | orchestrator | TASK [ceph-mgr : Include start_mgr.yml] **************************************** 2026-04-20 00:57:41.201449 | orchestrator | Monday 20 April 2026 00:52:23 +0000 (0:00:00.288) 0:05:19.774 ********** 2026-04-20 00:57:41.201456 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/start_mgr.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:57:41.201461 | orchestrator | 2026-04-20 00:57:41.201468 | orchestrator | TASK [ceph-mgr : Ensure systemd service override directory exists] ************* 2026-04-20 00:57:41.201474 | orchestrator | Monday 20 April 2026 00:52:23 +0000 (0:00:00.464) 0:05:20.239 ********** 2026-04-20 00:57:41.201479 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.201486 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.201519 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.201525 | orchestrator | 2026-04-20 00:57:41.201529 | orchestrator | TASK [ceph-mgr : Add ceph-mgr systemd service overrides] *********************** 2026-04-20 00:57:41.201546 | orchestrator | Monday 20 April 2026 00:52:24 +0000 (0:00:00.432) 0:05:20.671 ********** 2026-04-20 00:57:41.201550 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.201554 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.201558 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.201561 | orchestrator | 2026-04-20 00:57:41.201565 | orchestrator | TASK 
[ceph-mgr : Include_tasks systemd.yml] ************************************ 2026-04-20 00:57:41.201569 | orchestrator | Monday 20 April 2026 00:52:24 +0000 (0:00:00.287) 0:05:20.958 ********** 2026-04-20 00:57:41.201573 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/systemd.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:57:41.201576 | orchestrator | 2026-04-20 00:57:41.201580 | orchestrator | TASK [ceph-mgr : Generate systemd unit file] *********************************** 2026-04-20 00:57:41.201584 | orchestrator | Monday 20 April 2026 00:52:25 +0000 (0:00:00.487) 0:05:21.445 ********** 2026-04-20 00:57:41.201588 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:57:41.201591 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.201595 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.201599 | orchestrator | 2026-04-20 00:57:41.201602 | orchestrator | TASK [ceph-mgr : Generate systemd ceph-mgr target file] ************************ 2026-04-20 00:57:41.201613 | orchestrator | Monday 20 April 2026 00:52:26 +0000 (0:00:01.186) 0:05:22.631 ********** 2026-04-20 00:57:41.201616 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:57:41.201620 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.201624 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.201627 | orchestrator | 2026-04-20 00:57:41.201631 | orchestrator | TASK [ceph-mgr : Enable ceph-mgr.target] *************************************** 2026-04-20 00:57:41.201635 | orchestrator | Monday 20 April 2026 00:52:27 +0000 (0:00:01.293) 0:05:23.925 ********** 2026-04-20 00:57:41.201639 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.201642 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:57:41.201646 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.201650 | orchestrator | 2026-04-20 00:57:41.201653 | orchestrator | TASK [ceph-mgr : Systemd start mgr] ******************************************** 
2026-04-20 00:57:41.201657 | orchestrator | Monday 20 April 2026 00:52:29 +0000 (0:00:01.707) 0:05:25.632 ********** 2026-04-20 00:57:41.201661 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.201665 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:57:41.201668 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.201672 | orchestrator | 2026-04-20 00:57:41.201676 | orchestrator | TASK [ceph-mgr : Include mgr_modules.yml] ************************************** 2026-04-20 00:57:41.201679 | orchestrator | Monday 20 April 2026 00:52:31 +0000 (0:00:01.869) 0:05:27.502 ********** 2026-04-20 00:57:41.201683 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.201687 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.201691 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/mgr_modules.yml for testbed-node-2 2026-04-20 00:57:41.201694 | orchestrator | 2026-04-20 00:57:41.201698 | orchestrator | TASK [ceph-mgr : Wait for all mgr to be up] ************************************ 2026-04-20 00:57:41.201705 | orchestrator | Monday 20 April 2026 00:52:31 +0000 (0:00:00.416) 0:05:27.918 ********** 2026-04-20 00:57:41.201709 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (30 retries left). 2026-04-20 00:57:41.201713 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (29 retries left). 2026-04-20 00:57:41.201717 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (28 retries left). 2026-04-20 00:57:41.201721 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (27 retries left). 2026-04-20 00:57:41.201725 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (26 retries left). 
2026-04-20 00:57:41.201728 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (25 retries left). 2026-04-20 00:57:41.201732 | orchestrator | ok: [testbed-node-2 -> testbed-node-0(192.168.16.10)] 2026-04-20 00:57:41.201736 | orchestrator | 2026-04-20 00:57:41.201740 | orchestrator | TASK [ceph-mgr : Get enabled modules from ceph-mgr] **************************** 2026-04-20 00:57:41.201743 | orchestrator | Monday 20 April 2026 00:53:08 +0000 (0:00:36.636) 0:06:04.554 ********** 2026-04-20 00:57:41.201747 | orchestrator | ok: [testbed-node-2 -> testbed-node-0(192.168.16.10)] 2026-04-20 00:57:41.201751 | orchestrator | 2026-04-20 00:57:41.201755 | orchestrator | TASK [ceph-mgr : Set _ceph_mgr_modules fact (convert _ceph_mgr_modules.stdout to a dict)] *** 2026-04-20 00:57:41.201758 | orchestrator | Monday 20 April 2026 00:53:09 +0000 (0:00:01.292) 0:06:05.847 ********** 2026-04-20 00:57:41.201762 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.201766 | orchestrator | 2026-04-20 00:57:41.201770 | orchestrator | TASK [ceph-mgr : Set _disabled_ceph_mgr_modules fact] ************************** 2026-04-20 00:57:41.201773 | orchestrator | Monday 20 April 2026 00:53:09 +0000 (0:00:00.314) 0:06:06.162 ********** 2026-04-20 00:57:41.201777 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.201781 | orchestrator | 2026-04-20 00:57:41.201784 | orchestrator | TASK [ceph-mgr : Disable ceph mgr enabled modules] ***************************** 2026-04-20 00:57:41.201788 | orchestrator | Monday 20 April 2026 00:53:09 +0000 (0:00:00.154) 0:06:06.317 ********** 2026-04-20 00:57:41.201795 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=iostat) 2026-04-20 00:57:41.201799 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=nfs) 2026-04-20 00:57:41.201803 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=restful) 2026-04-20 
00:57:41.201806 | orchestrator | 2026-04-20 00:57:41.201810 | orchestrator | TASK [ceph-mgr : Add modules to ceph-mgr] ************************************** 2026-04-20 00:57:41.201814 | orchestrator | Monday 20 April 2026 00:53:16 +0000 (0:00:06.453) 0:06:12.771 ********** 2026-04-20 00:57:41.201831 | orchestrator | skipping: [testbed-node-2] => (item=balancer)  2026-04-20 00:57:41.201835 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=dashboard) 2026-04-20 00:57:41.201839 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=prometheus) 2026-04-20 00:57:41.201843 | orchestrator | skipping: [testbed-node-2] => (item=status)  2026-04-20 00:57:41.201846 | orchestrator | 2026-04-20 00:57:41.201850 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2026-04-20 00:57:41.201854 | orchestrator | Monday 20 April 2026 00:53:21 +0000 (0:00:04.668) 0:06:17.439 ********** 2026-04-20 00:57:41.201857 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:57:41.201861 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.201865 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.201869 | orchestrator | 2026-04-20 00:57:41.201872 | orchestrator | RUNNING HANDLER [ceph-handler : Mgrs handler] ********************************** 2026-04-20 00:57:41.201876 | orchestrator | Monday 20 April 2026 00:53:21 +0000 (0:00:00.765) 0:06:18.205 ********** 2026-04-20 00:57:41.201880 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mgrs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:57:41.201884 | orchestrator | 2026-04-20 00:57:41.201887 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called before restart] ******** 2026-04-20 00:57:41.201891 | orchestrator | Monday 20 April 2026 00:53:22 +0000 (0:00:00.465) 0:06:18.671 ********** 2026-04-20 00:57:41.201895 | orchestrator | ok: [testbed-node-0] 
2026-04-20 00:57:41.201898 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.201902 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.201906 | orchestrator | 2026-04-20 00:57:41.201910 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mgr restart script] *********************** 2026-04-20 00:57:41.201913 | orchestrator | Monday 20 April 2026 00:53:22 +0000 (0:00:00.258) 0:06:18.929 ********** 2026-04-20 00:57:41.201917 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:57:41.201921 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.201925 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.201928 | orchestrator | 2026-04-20 00:57:41.201932 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mgr daemon(s)] ******************** 2026-04-20 00:57:41.201936 | orchestrator | Monday 20 April 2026 00:53:23 +0000 (0:00:01.325) 0:06:20.255 ********** 2026-04-20 00:57:41.201939 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2026-04-20 00:57:41.201943 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2026-04-20 00:57:41.201947 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2026-04-20 00:57:41.201951 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.201954 | orchestrator | 2026-04-20 00:57:41.201958 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called after restart] ********* 2026-04-20 00:57:41.201962 | orchestrator | Monday 20 April 2026 00:53:24 +0000 (0:00:00.467) 0:06:20.722 ********** 2026-04-20 00:57:41.201965 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.201969 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.201973 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.201977 | orchestrator | 2026-04-20 00:57:41.201980 | orchestrator | PLAY [Apply role ceph-osd] ***************************************************** 2026-04-20 00:57:41.201984 | orchestrator | 2026-04-20 00:57:41.201992 | 
orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2026-04-20 00:57:41.201998 | orchestrator | Monday 20 April 2026 00:53:24 +0000 (0:00:00.508) 0:06:21.231 ********** 2026-04-20 00:57:41.202008 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:57:41.202062 | orchestrator | 2026-04-20 00:57:41.202068 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2026-04-20 00:57:41.202073 | orchestrator | Monday 20 April 2026 00:53:25 +0000 (0:00:00.614) 0:06:21.845 ********** 2026-04-20 00:57:41.202079 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:57:41.202085 | orchestrator | 2026-04-20 00:57:41.202090 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2026-04-20 00:57:41.202097 | orchestrator | Monday 20 April 2026 00:53:25 +0000 (0:00:00.477) 0:06:22.323 ********** 2026-04-20 00:57:41.202102 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.202108 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.202114 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.202120 | orchestrator | 2026-04-20 00:57:41.202125 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2026-04-20 00:57:41.202131 | orchestrator | Monday 20 April 2026 00:53:26 +0000 (0:00:00.262) 0:06:22.586 ********** 2026-04-20 00:57:41.202137 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.202143 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.202148 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.202153 | orchestrator | 2026-04-20 00:57:41.202159 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2026-04-20 
00:57:41.202164 | orchestrator | Monday 20 April 2026 00:53:27 +0000 (0:00:00.847) 0:06:23.433 ********** 2026-04-20 00:57:41.202170 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.202176 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.202181 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.202187 | orchestrator | 2026-04-20 00:57:41.202193 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2026-04-20 00:57:41.202199 | orchestrator | Monday 20 April 2026 00:53:27 +0000 (0:00:00.657) 0:06:24.090 ********** 2026-04-20 00:57:41.202205 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.202234 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.202240 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.202246 | orchestrator | 2026-04-20 00:57:41.202252 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2026-04-20 00:57:41.202258 | orchestrator | Monday 20 April 2026 00:53:28 +0000 (0:00:00.646) 0:06:24.737 ********** 2026-04-20 00:57:41.202264 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.202268 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.202272 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.202275 | orchestrator | 2026-04-20 00:57:41.202303 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2026-04-20 00:57:41.202308 | orchestrator | Monday 20 April 2026 00:53:28 +0000 (0:00:00.266) 0:06:25.003 ********** 2026-04-20 00:57:41.202311 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.202315 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.202319 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.202323 | orchestrator | 2026-04-20 00:57:41.202327 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2026-04-20 00:57:41.202330 | orchestrator | Monday 
20 April 2026 00:53:29 +0000 (0:00:00.455) 0:06:25.459 ********** 2026-04-20 00:57:41.202334 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.202338 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.202342 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.202345 | orchestrator | 2026-04-20 00:57:41.202349 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2026-04-20 00:57:41.202353 | orchestrator | Monday 20 April 2026 00:53:29 +0000 (0:00:00.251) 0:06:25.711 ********** 2026-04-20 00:57:41.202357 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.202360 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.202370 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.202373 | orchestrator | 2026-04-20 00:57:41.202377 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2026-04-20 00:57:41.202381 | orchestrator | Monday 20 April 2026 00:53:29 +0000 (0:00:00.657) 0:06:26.369 ********** 2026-04-20 00:57:41.202385 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.202388 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.202392 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.202396 | orchestrator | 2026-04-20 00:57:41.202400 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2026-04-20 00:57:41.202403 | orchestrator | Monday 20 April 2026 00:53:30 +0000 (0:00:00.774) 0:06:27.144 ********** 2026-04-20 00:57:41.202407 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.202411 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.202415 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.202418 | orchestrator | 2026-04-20 00:57:41.202422 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2026-04-20 00:57:41.202426 | orchestrator | Monday 20 April 2026 00:53:31 +0000 
(0:00:00.461) 0:06:27.605 **********
2026-04-20 00:57:41.202430 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.202434 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.202437 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.202441 | orchestrator |
2026-04-20 00:57:41.202445 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ******************************
2026-04-20 00:57:41.202449 | orchestrator | Monday 20 April 2026 00:53:31 +0000 (0:00:00.258) 0:06:27.864 **********
2026-04-20 00:57:41.202452 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.202456 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.202460 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.202463 | orchestrator |
2026-04-20 00:57:41.202467 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ******************************
2026-04-20 00:57:41.202471 | orchestrator | Monday 20 April 2026 00:53:31 +0000 (0:00:00.233) 0:06:28.098 **********
2026-04-20 00:57:41.202475 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.202478 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.202482 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.202486 | orchestrator |
2026-04-20 00:57:41.202493 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ******************************
2026-04-20 00:57:41.202497 | orchestrator | Monday 20 April 2026 00:53:32 +0000 (0:00:00.279) 0:06:28.377 **********
2026-04-20 00:57:41.202501 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.202506 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.202512 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.202518 | orchestrator |
2026-04-20 00:57:41.202524 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ******************************
2026-04-20 00:57:41.202529 | orchestrator | Monday 20 April 2026 00:53:32 +0000 (0:00:00.447) 0:06:28.824 **********
2026-04-20 00:57:41.202535 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.202541 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.202547 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.202553 | orchestrator |
2026-04-20 00:57:41.202559 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ******************************
2026-04-20 00:57:41.202565 | orchestrator | Monday 20 April 2026 00:53:32 +0000 (0:00:00.273) 0:06:29.098 **********
2026-04-20 00:57:41.202571 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.202578 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.202584 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.202590 | orchestrator |
2026-04-20 00:57:41.202597 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ******************************
2026-04-20 00:57:41.202603 | orchestrator | Monday 20 April 2026 00:53:33 +0000 (0:00:00.284) 0:06:29.383 **********
2026-04-20 00:57:41.202610 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.202616 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.202620 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.202627 | orchestrator |
2026-04-20 00:57:41.202631 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] ****************************
2026-04-20 00:57:41.202635 | orchestrator | Monday 20 April 2026 00:53:33 +0000 (0:00:00.308) 0:06:29.691 **********
2026-04-20 00:57:41.202638 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.202642 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.202646 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.202650 | orchestrator |
2026-04-20 00:57:41.202653 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] *************************
2026-04-20 00:57:41.202657 | orchestrator | Monday 20 April 2026 00:53:33 +0000 (0:00:00.540) 0:06:30.232 **********
2026-04-20 00:57:41.202661 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.202664 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.202668 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.202671 | orchestrator |
2026-04-20 00:57:41.202675 | orchestrator | TASK [ceph-osd : Set_fact add_osd] *********************************************
2026-04-20 00:57:41.202679 | orchestrator | Monday 20 April 2026 00:53:34 +0000 (0:00:00.471) 0:06:30.703 **********
2026-04-20 00:57:41.202682 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.202686 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.202690 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.202693 | orchestrator |
2026-04-20 00:57:41.202697 | orchestrator | TASK [ceph-osd : Set_fact container_exec_cmd] **********************************
2026-04-20 00:57:41.202705 | orchestrator | Monday 20 April 2026 00:53:34 +0000 (0:00:00.284) 0:06:30.987 **********
2026-04-20 00:57:41.202709 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
2026-04-20 00:57:41.202713 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2026-04-20 00:57:41.202716 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2026-04-20 00:57:41.202720 | orchestrator |
2026-04-20 00:57:41.202724 | orchestrator | TASK [ceph-osd : Include_tasks system_tuning.yml] ******************************
2026-04-20 00:57:41.202728 | orchestrator | Monday 20 April 2026 00:53:35 +0000 (0:00:00.747) 0:06:31.734 **********
2026-04-20 00:57:41.202731 | orchestrator | included: /ansible/roles/ceph-osd/tasks/system_tuning.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:57:41.202735 | orchestrator |
2026-04-20 00:57:41.202739 | orchestrator | TASK [ceph-osd : Create tmpfiles.d directory] **********************************
2026-04-20 00:57:41.202743 | orchestrator | Monday 20 April 2026 00:53:35 +0000 (0:00:00.596) 0:06:32.331 **********
2026-04-20 00:57:41.202746 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.202750 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.202754 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.202758 | orchestrator |
2026-04-20 00:57:41.202761 | orchestrator | TASK [ceph-osd : Disable transparent hugepage] *********************************
2026-04-20 00:57:41.202765 | orchestrator | Monday 20 April 2026 00:53:36 +0000 (0:00:00.288) 0:06:32.620 **********
2026-04-20 00:57:41.202769 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.202772 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.202776 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.202780 | orchestrator |
2026-04-20 00:57:41.202784 | orchestrator | TASK [ceph-osd : Get default vm.min_free_kbytes] *******************************
2026-04-20 00:57:41.202788 | orchestrator | Monday 20 April 2026 00:53:36 +0000 (0:00:00.283) 0:06:32.904 **********
2026-04-20 00:57:41.202791 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.202795 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.202799 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.202803 | orchestrator |
2026-04-20 00:57:41.202806 | orchestrator | TASK [ceph-osd : Set_fact vm_min_free_kbytes] **********************************
2026-04-20 00:57:41.202810 | orchestrator | Monday 20 April 2026 00:53:37 +0000 (0:00:00.816) 0:06:33.721 **********
2026-04-20 00:57:41.202814 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.202818 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.202821 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.202825 | orchestrator |
2026-04-20 00:57:41.202832 | orchestrator | TASK [ceph-osd : Apply operating system tuning] ********************************
2026-04-20 00:57:41.202836 | orchestrator | Monday 20 April 2026 00:53:37 +0000 (0:00:00.287) 0:06:34.008 **********
2026-04-20 00:57:41.202840 | orchestrator | changed: [testbed-node-4] => (item={'name': 'fs.aio-max-nr', 'value': '1048576', 'enable': True})
2026-04-20 00:57:41.202843 | orchestrator | changed: [testbed-node-5] => (item={'name': 'fs.aio-max-nr', 'value': '1048576', 'enable': True})
2026-04-20 00:57:41.202853 | orchestrator | changed: [testbed-node-3] => (item={'name': 'fs.aio-max-nr', 'value': '1048576', 'enable': True})
2026-04-20 00:57:41.202857 | orchestrator | changed: [testbed-node-5] => (item={'name': 'fs.file-max', 'value': 26234859})
2026-04-20 00:57:41.202861 | orchestrator | changed: [testbed-node-4] => (item={'name': 'fs.file-max', 'value': 26234859})
2026-04-20 00:57:41.202865 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.zone_reclaim_mode', 'value': 0})
2026-04-20 00:57:41.202869 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.swappiness', 'value': 10})
2026-04-20 00:57:41.202872 | orchestrator | changed: [testbed-node-3] => (item={'name': 'fs.file-max', 'value': 26234859})
2026-04-20 00:57:41.202876 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.zone_reclaim_mode', 'value': 0})
2026-04-20 00:57:41.202880 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.zone_reclaim_mode', 'value': 0})
2026-04-20 00:57:41.202883 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.swappiness', 'value': 10})
2026-04-20 00:57:41.202887 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.swappiness', 'value': 10})
2026-04-20 00:57:41.202891 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.min_free_kbytes', 'value': '67584'})
2026-04-20 00:57:41.202895 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.min_free_kbytes', 'value': '67584'})
2026-04-20 00:57:41.202898 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.min_free_kbytes', 'value': '67584'})
2026-04-20 00:57:41.202902 | orchestrator |
2026-04-20 00:57:41.202906 | orchestrator | TASK [ceph-osd : Install dependencies] *****************************************
2026-04-20 00:57:41.202910 | orchestrator | Monday 20 April 2026 00:53:40 +0000 (0:00:03.087) 0:06:37.096 **********
2026-04-20 00:57:41.202913 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.202917 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.202921 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.202925 | orchestrator |
2026-04-20 00:57:41.202928 | orchestrator | TASK [ceph-osd : Include_tasks common.yml] *************************************
2026-04-20 00:57:41.202932 | orchestrator | Monday 20 April 2026 00:53:40 +0000 (0:00:00.255) 0:06:37.352 **********
2026-04-20 00:57:41.202936 | orchestrator | included: /ansible/roles/ceph-osd/tasks/common.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:57:41.202940 | orchestrator |
2026-04-20 00:57:41.202943 | orchestrator | TASK [ceph-osd : Create bootstrap-osd and osd directories] *********************
2026-04-20 00:57:41.202947 | orchestrator | Monday 20 April 2026 00:53:41 +0000 (0:00:00.617) 0:06:37.970 **********
2026-04-20 00:57:41.202951 | orchestrator | ok: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-osd/)
2026-04-20 00:57:41.202955 | orchestrator | ok: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-osd/)
2026-04-20 00:57:41.202965 | orchestrator | ok: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-osd/)
2026-04-20 00:57:41.202969 | orchestrator | ok: [testbed-node-3] => (item=/var/lib/ceph/osd/)
2026-04-20 00:57:41.202973 | orchestrator | ok: [testbed-node-4] => (item=/var/lib/ceph/osd/)
2026-04-20 00:57:41.202976 | orchestrator | ok: [testbed-node-5] => (item=/var/lib/ceph/osd/)
2026-04-20 00:57:41.202980 | orchestrator |
2026-04-20 00:57:41.202984 | orchestrator | TASK [ceph-osd : Get keys from monitors] ***************************************
2026-04-20 00:57:41.202988 | orchestrator | Monday 20 April 2026 00:53:42 +0000 (0:00:01.020) 0:06:38.991 **********
2026-04-20 00:57:41.202991 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:57:41.203000 | orchestrator | skipping: [testbed-node-3] => (item=None)
2026-04-20 00:57:41.203004 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}]
2026-04-20 00:57:41.203007 | orchestrator |
2026-04-20 00:57:41.203011 | orchestrator | TASK [ceph-osd : Copy ceph key(s) if needed] ***********************************
2026-04-20 00:57:41.203015 | orchestrator | Monday 20 April 2026 00:53:44 +0000 (0:00:02.233) 0:06:41.224 **********
2026-04-20 00:57:41.203019 | orchestrator | changed: [testbed-node-3] => (item=None)
2026-04-20 00:57:41.203023 | orchestrator | skipping: [testbed-node-3] => (item=None)
2026-04-20 00:57:41.203026 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:57:41.203030 | orchestrator | changed: [testbed-node-5] => (item=None)
2026-04-20 00:57:41.203034 | orchestrator | skipping: [testbed-node-5] => (item=None)
2026-04-20 00:57:41.203038 | orchestrator | changed: [testbed-node-4] => (item=None)
2026-04-20 00:57:41.203041 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:57:41.203045 | orchestrator | skipping: [testbed-node-4] => (item=None)
2026-04-20 00:57:41.203049 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:57:41.203053 | orchestrator |
2026-04-20 00:57:41.203056 | orchestrator | TASK [ceph-osd : Set noup flag] ************************************************
2026-04-20 00:57:41.203060 | orchestrator | Monday 20 April 2026 00:53:46 +0000 (0:00:01.418) 0:06:42.643 **********
2026-04-20 00:57:41.203064 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)]
2026-04-20 00:57:41.203067 | orchestrator |
2026-04-20 00:57:41.203071 | orchestrator | TASK [ceph-osd : Include_tasks scenarios/lvm.yml] ******************************
2026-04-20 00:57:41.203075 | orchestrator | Monday 20 April 2026 00:53:49 +0000 (0:00:02.733) 0:06:45.377 **********
2026-04-20 00:57:41.203079 | orchestrator | included: /ansible/roles/ceph-osd/tasks/scenarios/lvm.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:57:41.203082 | orchestrator |
2026-04-20 00:57:41.203086 | orchestrator | TASK [ceph-osd : Use ceph-volume to create osds] *******************************
2026-04-20 00:57:41.203090 | orchestrator | Monday 20 April 2026 00:53:49 +0000 (0:00:00.460) 0:06:45.837 **********
2026-04-20 00:57:41.203094 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-4264b90b-a777-529d-80cd-078215cd7b61', 'data_vg': 'ceph-4264b90b-a777-529d-80cd-078215cd7b61'})
2026-04-20 00:57:41.203100 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-f2b53557-bc93-5e7c-9922-524bc90e2f58', 'data_vg': 'ceph-f2b53557-bc93-5e7c-9922-524bc90e2f58'})
2026-04-20 00:57:41.203104 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-7b8b741f-ff85-57a0-9457-c04aa474e6a9', 'data_vg': 'ceph-7b8b741f-ff85-57a0-9457-c04aa474e6a9'})
2026-04-20 00:57:41.203108 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-0c7195b4-6e55-5dce-81dc-250aafa1626c', 'data_vg': 'ceph-0c7195b4-6e55-5dce-81dc-250aafa1626c'})
2026-04-20 00:57:41.203112 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-575cdf11-a3b3-50b3-a6b0-c04d40287ec6', 'data_vg': 'ceph-575cdf11-a3b3-50b3-a6b0-c04d40287ec6'})
2026-04-20 00:57:41.203116 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-a3c07e85-95b7-5759-bf4d-00aad97d3561', 'data_vg': 'ceph-a3c07e85-95b7-5759-bf4d-00aad97d3561'})
2026-04-20 00:57:41.203120 | orchestrator |
2026-04-20 00:57:41.203123 | orchestrator | TASK [ceph-osd : Include_tasks scenarios/lvm-batch.yml] ************************
2026-04-20 00:57:41.203127 | orchestrator | Monday 20 April 2026 00:54:32 +0000 (0:00:42.940) 0:07:28.778 **********
2026-04-20 00:57:41.203131 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.203135 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.203138 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.203142 | orchestrator |
2026-04-20 00:57:41.203146 | orchestrator | TASK [ceph-osd : Include_tasks start_osds.yml] *********************************
2026-04-20 00:57:41.203149 | orchestrator | Monday 20 April 2026 00:54:32 +0000 (0:00:00.532) 0:07:29.310 **********
2026-04-20 00:57:41.203153 | orchestrator | included: /ansible/roles/ceph-osd/tasks/start_osds.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:57:41.203228 | orchestrator |
2026-04-20 00:57:41.203239 | orchestrator | TASK [ceph-osd : Get osd ids] **************************************************
2026-04-20 00:57:41.203245 | orchestrator | Monday 20 April 2026 00:54:33 +0000 (0:00:00.488) 0:07:29.798 **********
2026-04-20 00:57:41.203250 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.203256 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.203261 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.203268 | orchestrator |
2026-04-20 00:57:41.203274 | orchestrator | TASK [ceph-osd : Collect osd ids] **********************************************
2026-04-20 00:57:41.203280 | orchestrator | Monday 20 April 2026 00:54:34 +0000 (0:00:00.585) 0:07:30.384 **********
2026-04-20 00:57:41.203286 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.203293 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.203299 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.203305 | orchestrator |
2026-04-20 00:57:41.203311 | orchestrator | TASK [ceph-osd : Include_tasks systemd.yml] ************************************
2026-04-20 00:57:41.203317 | orchestrator | Monday 20 April 2026 00:54:37 +0000 (0:00:03.016) 0:07:33.400 **********
2026-04-20 00:57:41.203331 | orchestrator | included: /ansible/roles/ceph-osd/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:57:41.203335 | orchestrator |
2026-04-20 00:57:41.203339 | orchestrator | TASK [ceph-osd : Generate systemd unit file] ***********************************
2026-04-20 00:57:41.203342 | orchestrator | Monday 20 April 2026 00:54:37 +0000 (0:00:00.490) 0:07:33.891 **********
2026-04-20 00:57:41.203346 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:57:41.203350 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:57:41.203353 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:57:41.203357 | orchestrator |
2026-04-20 00:57:41.203361 | orchestrator | TASK [ceph-osd : Generate systemd ceph-osd target file] ************************
2026-04-20 00:57:41.203365 | orchestrator | Monday 20 April 2026 00:54:38 +0000 (0:00:01.095) 0:07:34.987 **********
2026-04-20 00:57:41.203369 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:57:41.203372 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:57:41.203376 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:57:41.203380 | orchestrator |
2026-04-20 00:57:41.203383 | orchestrator | TASK [ceph-osd : Enable ceph-osd.target] ***************************************
2026-04-20 00:57:41.203387 | orchestrator | Monday 20 April 2026 00:54:39 +0000 (0:00:01.222) 0:07:36.209 **********
2026-04-20 00:57:41.203391 | orchestrator | changed: [testbed-node-3]
2026-04-20 00:57:41.203417 | orchestrator | changed: [testbed-node-4]
2026-04-20 00:57:41.203421 | orchestrator | changed: [testbed-node-5]
2026-04-20 00:57:41.203429 | orchestrator |
2026-04-20 00:57:41.203433 | orchestrator | TASK [ceph-osd : Ensure systemd service override directory exists] *************
2026-04-20 00:57:41.203436 | orchestrator | Monday 20 April 2026 00:54:41 +0000 (0:00:01.865) 0:07:38.075 **********
2026-04-20 00:57:41.203440 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.203444 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.203447 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.203451 | orchestrator |
2026-04-20 00:57:41.203455 | orchestrator | TASK [ceph-osd : Add ceph-osd systemd service overrides] ***********************
2026-04-20 00:57:41.203459 | orchestrator | Monday 20 April 2026 00:54:41 +0000 (0:00:00.290) 0:07:38.365 **********
2026-04-20 00:57:41.203463 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.203466 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.203470 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.203474 | orchestrator |
2026-04-20 00:57:41.203477 | orchestrator | TASK [ceph-osd : Ensure /var/lib/ceph/osd/- is present] *********
2026-04-20 00:57:41.203481 | orchestrator | Monday 20 April 2026 00:54:42 +0000 (0:00:00.316) 0:07:38.681 **********
2026-04-20 00:57:41.203485 | orchestrator | ok: [testbed-node-3] => (item=3)
2026-04-20 00:57:41.203489 | orchestrator | ok: [testbed-node-4] => (item=5)
2026-04-20 00:57:41.203492 | orchestrator | ok: [testbed-node-5] => (item=4)
2026-04-20 00:57:41.203496 | orchestrator | ok: [testbed-node-3] => (item=2)
2026-04-20 00:57:41.203500 | orchestrator | ok: [testbed-node-4] => (item=1)
2026-04-20 00:57:41.203515 | orchestrator | ok: [testbed-node-5] => (item=0)
2026-04-20 00:57:41.203522 | orchestrator |
2026-04-20 00:57:41.203531 | orchestrator | TASK [ceph-osd : Write run file in /var/lib/ceph/osd/xxxx/run] *****************
2026-04-20 00:57:41.203537 | orchestrator | Monday 20 April 2026 00:54:43 +0000 (0:00:01.281) 0:07:39.963 **********
2026-04-20 00:57:41.203546 | orchestrator | changed: [testbed-node-3] => (item=3)
2026-04-20 00:57:41.203552 | orchestrator | changed: [testbed-node-4] => (item=5)
2026-04-20 00:57:41.203558 | orchestrator | changed: [testbed-node-5] => (item=4)
2026-04-20 00:57:41.203563 | orchestrator | changed: [testbed-node-5] => (item=0)
2026-04-20 00:57:41.203569 | orchestrator | changed: [testbed-node-3] => (item=2)
2026-04-20 00:57:41.203575 | orchestrator | changed: [testbed-node-4] => (item=1)
2026-04-20 00:57:41.203580 | orchestrator |
2026-04-20 00:57:41.203585 | orchestrator | TASK [ceph-osd : Systemd start osd] ********************************************
2026-04-20 00:57:41.203591 | orchestrator | Monday 20 April 2026 00:54:45 +0000 (0:00:02.381) 0:07:42.344 **********
2026-04-20 00:57:41.203596 | orchestrator | changed: [testbed-node-3] => (item=3)
2026-04-20 00:57:41.203601 | orchestrator | changed: [testbed-node-5] => (item=4)
2026-04-20 00:57:41.203608 | orchestrator | changed: [testbed-node-4] => (item=5)
2026-04-20 00:57:41.203614 | orchestrator | changed: [testbed-node-3] => (item=2)
2026-04-20 00:57:41.203620 | orchestrator | changed: [testbed-node-5] => (item=0)
2026-04-20 00:57:41.203625 | orchestrator | changed: [testbed-node-4] => (item=1)
2026-04-20 00:57:41.203631 | orchestrator |
2026-04-20 00:57:41.203638 | orchestrator | TASK [ceph-osd : Unset noup flag] **********************************************
2026-04-20 00:57:41.203644 | orchestrator | Monday 20 April 2026 00:54:49 +0000 (0:00:03.531) 0:07:45.876 **********
2026-04-20 00:57:41.203649 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.203656 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.203663 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)]
2026-04-20 00:57:41.203667 | orchestrator |
2026-04-20 00:57:41.203671 | orchestrator | TASK [ceph-osd : Wait for all osd to be up] ************************************
2026-04-20 00:57:41.203675 | orchestrator | Monday 20 April 2026 00:54:52 +0000 (0:00:02.872) 0:07:48.749 **********
2026-04-20 00:57:41.203679 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.203682 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.203686 | orchestrator | FAILED - RETRYING: [testbed-node-5 -> testbed-node-0]: Wait for all osd to be up (60 retries left).
2026-04-20 00:57:41.203690 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)]
2026-04-20 00:57:41.203694 | orchestrator |
2026-04-20 00:57:41.203698 | orchestrator | TASK [ceph-osd : Include crush_rules.yml] **************************************
2026-04-20 00:57:41.203701 | orchestrator | Monday 20 April 2026 00:55:05 +0000 (0:00:12.626) 0:08:01.375 **********
2026-04-20 00:57:41.203705 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.203709 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.203712 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.203716 | orchestrator |
2026-04-20 00:57:41.203720 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] **********************
2026-04-20 00:57:41.203724 | orchestrator | Monday 20 April 2026 00:55:05 +0000 (0:00:00.878) 0:08:02.254 **********
2026-04-20 00:57:41.203727 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.203731 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.203739 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.203743 | orchestrator |
2026-04-20 00:57:41.203747 | orchestrator | RUNNING HANDLER [ceph-handler : Osds handler] **********************************
2026-04-20 00:57:41.203751 | orchestrator | Monday 20 April 2026 00:55:06 +0000 (0:00:00.552) 0:08:02.806 **********
2026-04-20 00:57:41.203754 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_osds.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:57:41.203758 | orchestrator |
2026-04-20 00:57:41.203762 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact trigger_restart] **********************
2026-04-20 00:57:41.203771 | orchestrator | Monday 20 April 2026 00:55:06 +0000 (0:00:00.523) 0:08:03.330 **********
2026-04-20 00:57:41.203774 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2026-04-20 00:57:41.203778 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2026-04-20 00:57:41.203782 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2026-04-20 00:57:41.203786 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.203789 | orchestrator |
2026-04-20 00:57:41.203793 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called before restart] ********
2026-04-20 00:57:41.203797 | orchestrator | Monday 20 April 2026 00:55:07 +0000 (0:00:00.374) 0:08:03.705 **********
2026-04-20 00:57:41.203801 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.203805 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.203808 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.203812 | orchestrator |
2026-04-20 00:57:41.203816 | orchestrator | RUNNING HANDLER [ceph-handler : Unset noup flag] *******************************
2026-04-20 00:57:41.203819 | orchestrator | Monday 20 April 2026 00:55:07 +0000 (0:00:00.287) 0:08:03.993 **********
2026-04-20 00:57:41.203823 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.203827 | orchestrator |
2026-04-20 00:57:41.203831 | orchestrator | RUNNING HANDLER [ceph-handler : Copy osd restart script] ***********************
2026-04-20 00:57:41.203834 | orchestrator | Monday 20 April 2026 00:55:07 +0000 (0:00:00.514) 0:08:04.208 **********
2026-04-20 00:57:41.203838 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.203842 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.203846 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.203849 | orchestrator |
2026-04-20 00:57:41.203853 | orchestrator | RUNNING HANDLER [ceph-handler : Get pool list] *********************************
2026-04-20 00:57:41.203857 | orchestrator | Monday 20 April 2026 00:55:08 +0000 (0:00:00.514) 0:08:04.722 **********
2026-04-20 00:57:41.203861 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.203865 | orchestrator |
2026-04-20 00:57:41.203869 | orchestrator | RUNNING HANDLER [ceph-handler : Get balancer module status] ********************
2026-04-20 00:57:41.203872 | orchestrator | Monday 20 April 2026 00:55:08 +0000 (0:00:00.226) 0:08:04.949 **********
2026-04-20 00:57:41.203876 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.203880 | orchestrator |
2026-04-20 00:57:41.203883 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact pools_pgautoscaler_mode] **************
2026-04-20 00:57:41.203887 | orchestrator | Monday 20 April 2026 00:55:08 +0000 (0:00:00.227) 0:08:05.177 **********
2026-04-20 00:57:41.203891 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.203895 | orchestrator |
2026-04-20 00:57:41.203905 | orchestrator | RUNNING HANDLER [ceph-handler : Disable balancer] ******************************
2026-04-20 00:57:41.203911 | orchestrator | Monday 20 April 2026 00:55:08 +0000 (0:00:00.114) 0:08:05.291 **********
2026-04-20 00:57:41.203916 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.203921 | orchestrator |
2026-04-20 00:57:41.203927 | orchestrator | RUNNING HANDLER [ceph-handler : Disable pg autoscale on pools] *****************
2026-04-20 00:57:41.203932 | orchestrator | Monday 20 April 2026 00:55:09 +0000 (0:00:00.202) 0:08:05.494 **********
2026-04-20 00:57:41.203937 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.203942 | orchestrator |
2026-04-20 00:57:41.203948 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph osds daemon(s)] *******************
2026-04-20 00:57:41.203953 | orchestrator | Monday 20 April 2026 00:55:09 +0000 (0:00:00.253) 0:08:05.747 **********
2026-04-20 00:57:41.203959 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2026-04-20 00:57:41.203964 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2026-04-20 00:57:41.203969 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2026-04-20 00:57:41.203975 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.203982 | orchestrator |
2026-04-20 00:57:41.203988 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called after restart] *********
2026-04-20 00:57:41.203994 | orchestrator | Monday 20 April 2026 00:55:09 +0000 (0:00:00.421) 0:08:06.168 **********
2026-04-20 00:57:41.204005 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.204010 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.204016 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.204021 | orchestrator |
2026-04-20 00:57:41.204027 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable pg autoscale on pools] ***************
2026-04-20 00:57:41.204033 | orchestrator | Monday 20 April 2026 00:55:10 +0000 (0:00:00.333) 0:08:06.502 **********
2026-04-20 00:57:41.204038 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.204044 | orchestrator |
2026-04-20 00:57:41.204050 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable balancer] ****************************
2026-04-20 00:57:41.204055 | orchestrator | Monday 20 April 2026 00:55:10 +0000 (0:00:00.783) 0:08:07.286 **********
2026-04-20 00:57:41.204061 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.204066 | orchestrator |
2026-04-20 00:57:41.204072 | orchestrator | PLAY [Apply role ceph-crash] ***************************************************
2026-04-20 00:57:41.204077 | orchestrator |
2026-04-20 00:57:41.204083 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************
2026-04-20 00:57:41.204090 | orchestrator | Monday 20 April 2026 00:55:11 +0000 (0:00:00.694) 0:08:07.980 **********
2026-04-20 00:57:41.204096 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:57:41.204105 | orchestrator |
2026-04-20 00:57:41.204112 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] *********************
2026-04-20 00:57:41.204124 | orchestrator | Monday 20 April 2026 00:55:12 +0000 (0:00:01.262) 0:08:09.243 **********
2026-04-20 00:57:41.204131 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2026-04-20 00:57:41.204138 | orchestrator |
2026-04-20 00:57:41.204144 | orchestrator | TASK [ceph-handler : Check for a mon container] ********************************
2026-04-20 00:57:41.204150 | orchestrator | Monday 20 April 2026 00:55:14 +0000 (0:00:01.184) 0:08:10.428 **********
2026-04-20 00:57:41.204156 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.204162 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.204168 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.204174 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.204180 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.204186 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.204192 | orchestrator |
2026-04-20 00:57:41.204198 | orchestrator | TASK [ceph-handler : Check for an osd container] *******************************
2026-04-20 00:57:41.204205 | orchestrator | Monday 20 April 2026 00:55:15 +0000 (0:00:01.141) 0:08:11.569 **********
2026-04-20 00:57:41.204227 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.204233 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.204237 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.204241 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.204244 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.204248 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.204252 | orchestrator |
2026-04-20 00:57:41.204256 | orchestrator | TASK [ceph-handler : Check for a mds container] ********************************
2026-04-20 00:57:41.204260 | orchestrator | Monday 20 April 2026 00:55:15 +0000 (0:00:00.676) 0:08:12.245 **********
2026-04-20 00:57:41.204264 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.204267 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.204271 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.204275 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.204279 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.204282 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.204286 | orchestrator |
2026-04-20 00:57:41.204290 | orchestrator | TASK [ceph-handler : Check for a rgw container] ********************************
2026-04-20 00:57:41.204294 | orchestrator | Monday 20 April 2026 00:55:16 +0000 (0:00:00.654) 0:08:12.900 **********
2026-04-20 00:57:41.204297 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.204306 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.204310 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.204313 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.204317 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.204321 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.204324 | orchestrator |
2026-04-20 00:57:41.204328 | orchestrator | TASK [ceph-handler : Check for a mgr container] ********************************
2026-04-20 00:57:41.204332 | orchestrator | Monday 20 April 2026 00:55:17 +0000 (0:00:00.993) 0:08:13.893 **********
2026-04-20 00:57:41.204336 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.204339 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.204343 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.204347 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.204351 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.204354 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.204358 | orchestrator |
2026-04-20 00:57:41.204365 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] *************************
2026-04-20 00:57:41.204369 | orchestrator | Monday 20 April 2026 00:55:18 +0000 (0:00:00.788) 0:08:14.682 **********
2026-04-20 00:57:41.204373 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.204376 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.204380 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.204384 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.204387 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.204391 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.204395 | orchestrator |
2026-04-20 00:57:41.204399 | orchestrator | TASK [ceph-handler : Check for a nfs container] ********************************
2026-04-20 00:57:41.204402 | orchestrator | Monday 20 April 2026 00:55:18 +0000 (0:00:00.566) 0:08:15.248 **********
2026-04-20 00:57:41.204406 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.204410 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.204414 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.204417 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.204421 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.204425 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.204428 | orchestrator |
2026-04-20 00:57:41.204432 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] *************************
2026-04-20 00:57:41.204436 | orchestrator | Monday 20 April 2026 00:55:19 +0000 (0:00:00.416) 0:08:15.665 **********
2026-04-20 00:57:41.204440 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.204444 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.204447 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.204451 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.204455 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.204459 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.204462 | orchestrator |
2026-04-20 00:57:41.204466 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] **********************
2026-04-20 00:57:41.204470 | orchestrator | Monday 20 April 2026 00:55:20 +0000 (0:00:01.052) 0:08:16.718 **********
2026-04-20 00:57:41.204474 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:57:41.204477 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:57:41.204481 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:57:41.204485 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.204489 | orchestrator | ok: [testbed-node-1]
2026-04-20 00:57:41.204492 | orchestrator | ok: [testbed-node-2]
2026-04-20 00:57:41.204496 | orchestrator |
2026-04-20 00:57:41.204500 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] *******************
2026-04-20 00:57:41.204504 | orchestrator | Monday 20 April 2026 00:55:21 +0000 (0:00:00.889) 0:08:17.608 **********
2026-04-20 00:57:41.204507 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.204511 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.204515 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.204519 | orchestrator | skipping: [testbed-node-0]
2026-04-20 00:57:41.204522 | orchestrator | skipping: [testbed-node-1]
2026-04-20 00:57:41.204530 | orchestrator | skipping: [testbed-node-2]
2026-04-20 00:57:41.204534 | orchestrator |
2026-04-20 00:57:41.204537 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ******************************
2026-04-20 00:57:41.204545 | orchestrator | Monday 20 April 2026 00:55:21 +0000 (0:00:00.647) 0:08:18.255 **********
2026-04-20 00:57:41.204549 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:57:41.204552 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:57:41.204556 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:57:41.204560 | orchestrator | ok: [testbed-node-0]
2026-04-20 00:57:41.204563 | orchestrator | ok: [testbed-node-1]
2026-04-20
00:57:41.204567 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.204571 | orchestrator | 2026-04-20 00:57:41.204575 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2026-04-20 00:57:41.204578 | orchestrator | Monday 20 April 2026 00:55:22 +0000 (0:00:00.555) 0:08:18.810 ********** 2026-04-20 00:57:41.204582 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.204586 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.204590 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.204593 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.204597 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.204601 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.204605 | orchestrator | 2026-04-20 00:57:41.204608 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2026-04-20 00:57:41.204612 | orchestrator | Monday 20 April 2026 00:55:23 +0000 (0:00:00.838) 0:08:19.649 ********** 2026-04-20 00:57:41.204616 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.204620 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.204623 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.204627 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.204631 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.204634 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.204638 | orchestrator | 2026-04-20 00:57:41.204642 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2026-04-20 00:57:41.204646 | orchestrator | Monday 20 April 2026 00:55:23 +0000 (0:00:00.606) 0:08:20.255 ********** 2026-04-20 00:57:41.204649 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.204653 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.204657 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.204661 | orchestrator | skipping: [testbed-node-0] 
2026-04-20 00:57:41.204664 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.204668 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.204672 | orchestrator | 2026-04-20 00:57:41.204677 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2026-04-20 00:57:41.204683 | orchestrator | Monday 20 April 2026 00:55:24 +0000 (0:00:00.797) 0:08:21.053 ********** 2026-04-20 00:57:41.204689 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.204694 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.204701 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.204707 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.204713 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.204720 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.204725 | orchestrator | 2026-04-20 00:57:41.204728 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2026-04-20 00:57:41.204732 | orchestrator | Monday 20 April 2026 00:55:25 +0000 (0:00:00.579) 0:08:21.632 ********** 2026-04-20 00:57:41.204736 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.204739 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.204743 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.204747 | orchestrator | skipping: [testbed-node-0] 2026-04-20 00:57:41.204750 | orchestrator | skipping: [testbed-node-1] 2026-04-20 00:57:41.204758 | orchestrator | skipping: [testbed-node-2] 2026-04-20 00:57:41.204761 | orchestrator | 2026-04-20 00:57:41.204765 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2026-04-20 00:57:41.204769 | orchestrator | Monday 20 April 2026 00:55:26 +0000 (0:00:00.782) 0:08:22.415 ********** 2026-04-20 00:57:41.204776 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.204780 | orchestrator | skipping: [testbed-node-4] 
2026-04-20 00:57:41.204784 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.204787 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.204791 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.204795 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.204799 | orchestrator | 2026-04-20 00:57:41.204802 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2026-04-20 00:57:41.204806 | orchestrator | Monday 20 April 2026 00:55:26 +0000 (0:00:00.641) 0:08:23.057 ********** 2026-04-20 00:57:41.204810 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.204813 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.204817 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.204821 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.204824 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.204828 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.204832 | orchestrator | 2026-04-20 00:57:41.204835 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2026-04-20 00:57:41.204839 | orchestrator | Monday 20 April 2026 00:55:27 +0000 (0:00:00.830) 0:08:23.888 ********** 2026-04-20 00:57:41.204843 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.204847 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.204850 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.204854 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.204857 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.204861 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.204865 | orchestrator | 2026-04-20 00:57:41.204869 | orchestrator | TASK [ceph-crash : Create client.crash keyring] ******************************** 2026-04-20 00:57:41.204872 | orchestrator | Monday 20 April 2026 00:55:28 +0000 (0:00:01.011) 0:08:24.899 ********** 2026-04-20 00:57:41.204876 | orchestrator | changed: [testbed-node-3 -> 
testbed-node-0(192.168.16.10)] 2026-04-20 00:57:41.204880 | orchestrator | 2026-04-20 00:57:41.204883 | orchestrator | TASK [ceph-crash : Get keys from monitors] ************************************* 2026-04-20 00:57:41.204887 | orchestrator | Monday 20 April 2026 00:55:31 +0000 (0:00:03.160) 0:08:28.060 ********** 2026-04-20 00:57:41.204891 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] 2026-04-20 00:57:41.204894 | orchestrator | 2026-04-20 00:57:41.204898 | orchestrator | TASK [ceph-crash : Copy ceph key(s) if needed] ********************************* 2026-04-20 00:57:41.204902 | orchestrator | Monday 20 April 2026 00:55:33 +0000 (0:00:01.919) 0:08:29.980 ********** 2026-04-20 00:57:41.204906 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.204909 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.204913 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.204932 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.204936 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.204940 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.204943 | orchestrator | 2026-04-20 00:57:41.204950 | orchestrator | TASK [ceph-crash : Create /var/lib/ceph/crash/posted] ************************** 2026-04-20 00:57:41.204954 | orchestrator | Monday 20 April 2026 00:55:35 +0000 (0:00:01.394) 0:08:31.374 ********** 2026-04-20 00:57:41.204958 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.204962 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.204965 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.204969 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:57:41.204973 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.204978 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.204984 | orchestrator | 2026-04-20 00:57:41.204990 | orchestrator | TASK [ceph-crash : Include_tasks systemd.yml] ********************************** 
2026-04-20 00:57:41.204996 | orchestrator | Monday 20 April 2026 00:55:36 +0000 (0:00:01.210) 0:08:32.585 ********** 2026-04-20 00:57:41.205002 | orchestrator | included: /ansible/roles/ceph-crash/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:57:41.205015 | orchestrator | 2026-04-20 00:57:41.205022 | orchestrator | TASK [ceph-crash : Generate systemd unit file for ceph-crash container] ******** 2026-04-20 00:57:41.205027 | orchestrator | Monday 20 April 2026 00:55:37 +0000 (0:00:01.187) 0:08:33.772 ********** 2026-04-20 00:57:41.205033 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.205039 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.205045 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.205051 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:57:41.205057 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.205063 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.205069 | orchestrator | 2026-04-20 00:57:41.205076 | orchestrator | TASK [ceph-crash : Start the ceph-crash service] ******************************* 2026-04-20 00:57:41.205082 | orchestrator | Monday 20 April 2026 00:55:38 +0000 (0:00:01.465) 0:08:35.237 ********** 2026-04-20 00:57:41.205088 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.205095 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.205101 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.205107 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:57:41.205113 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.205118 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.205125 | orchestrator | 2026-04-20 00:57:41.205131 | orchestrator | RUNNING HANDLER [ceph-handler : Ceph crash handler] **************************** 2026-04-20 00:57:41.205137 | orchestrator | Monday 20 April 2026 00:55:42 +0000 (0:00:03.414) 
0:08:38.652 ********** 2026-04-20 00:57:41.205144 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_crash.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:57:41.205150 | orchestrator | 2026-04-20 00:57:41.205156 | orchestrator | RUNNING HANDLER [ceph-handler : Set _crash_handler_called before restart] ****** 2026-04-20 00:57:41.205163 | orchestrator | Monday 20 April 2026 00:55:43 +0000 (0:00:01.413) 0:08:40.065 ********** 2026-04-20 00:57:41.205169 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.205175 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.205179 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.205183 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.205186 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.205197 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.205201 | orchestrator | 2026-04-20 00:57:41.205205 | orchestrator | RUNNING HANDLER [ceph-handler : Restart the ceph-crash service] **************** 2026-04-20 00:57:41.205262 | orchestrator | Monday 20 April 2026 00:55:44 +0000 (0:00:00.593) 0:08:40.659 ********** 2026-04-20 00:57:41.205269 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.205275 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.205281 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.205287 | orchestrator | changed: [testbed-node-0] 2026-04-20 00:57:41.205293 | orchestrator | changed: [testbed-node-1] 2026-04-20 00:57:41.205298 | orchestrator | changed: [testbed-node-2] 2026-04-20 00:57:41.205305 | orchestrator | 2026-04-20 00:57:41.205310 | orchestrator | RUNNING HANDLER [ceph-handler : Set _crash_handler_called after restart] ******* 2026-04-20 00:57:41.205316 | orchestrator | Monday 20 April 2026 00:55:46 +0000 (0:00:02.525) 0:08:43.184 ********** 2026-04-20 00:57:41.205321 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.205327 | 
orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.205333 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.205339 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:57:41.205345 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:57:41.205351 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:57:41.205358 | orchestrator | 2026-04-20 00:57:41.205364 | orchestrator | PLAY [Apply role ceph-mds] ***************************************************** 2026-04-20 00:57:41.205369 | orchestrator | 2026-04-20 00:57:41.205376 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2026-04-20 00:57:41.205382 | orchestrator | Monday 20 April 2026 00:55:47 +0000 (0:00:00.644) 0:08:43.828 ********** 2026-04-20 00:57:41.205386 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:57:41.205395 | orchestrator | 2026-04-20 00:57:41.205399 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2026-04-20 00:57:41.205403 | orchestrator | Monday 20 April 2026 00:55:47 +0000 (0:00:00.478) 0:08:44.307 ********** 2026-04-20 00:57:41.205407 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:57:41.205411 | orchestrator | 2026-04-20 00:57:41.205415 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2026-04-20 00:57:41.205419 | orchestrator | Monday 20 April 2026 00:55:48 +0000 (0:00:00.372) 0:08:44.679 ********** 2026-04-20 00:57:41.205422 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.205426 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.205430 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.205434 | orchestrator | 2026-04-20 00:57:41.205437 | orchestrator | TASK [ceph-handler : Check for an osd 
container] ******************************* 2026-04-20 00:57:41.205441 | orchestrator | Monday 20 April 2026 00:55:48 +0000 (0:00:00.315) 0:08:44.995 ********** 2026-04-20 00:57:41.205445 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.205449 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.205453 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.205456 | orchestrator | 2026-04-20 00:57:41.205467 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2026-04-20 00:57:41.205471 | orchestrator | Monday 20 April 2026 00:55:49 +0000 (0:00:00.585) 0:08:45.581 ********** 2026-04-20 00:57:41.205475 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.205478 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.205482 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.205486 | orchestrator | 2026-04-20 00:57:41.205489 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2026-04-20 00:57:41.205493 | orchestrator | Monday 20 April 2026 00:55:49 +0000 (0:00:00.596) 0:08:46.178 ********** 2026-04-20 00:57:41.205497 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.205501 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.205504 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.205508 | orchestrator | 2026-04-20 00:57:41.205512 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2026-04-20 00:57:41.205515 | orchestrator | Monday 20 April 2026 00:55:50 +0000 (0:00:00.662) 0:08:46.840 ********** 2026-04-20 00:57:41.205519 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.205523 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.205526 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.205530 | orchestrator | 2026-04-20 00:57:41.205534 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2026-04-20 
00:57:41.205538 | orchestrator | Monday 20 April 2026 00:55:50 +0000 (0:00:00.417) 0:08:47.258 ********** 2026-04-20 00:57:41.205541 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.205545 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.205549 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.205552 | orchestrator | 2026-04-20 00:57:41.205556 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2026-04-20 00:57:41.205560 | orchestrator | Monday 20 April 2026 00:55:51 +0000 (0:00:00.281) 0:08:47.540 ********** 2026-04-20 00:57:41.205563 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.205567 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.205571 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.205575 | orchestrator | 2026-04-20 00:57:41.205578 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2026-04-20 00:57:41.205582 | orchestrator | Monday 20 April 2026 00:55:51 +0000 (0:00:00.260) 0:08:47.800 ********** 2026-04-20 00:57:41.205586 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.205590 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.205593 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.205597 | orchestrator | 2026-04-20 00:57:41.205601 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2026-04-20 00:57:41.205610 | orchestrator | Monday 20 April 2026 00:55:52 +0000 (0:00:00.599) 0:08:48.399 ********** 2026-04-20 00:57:41.205613 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.205617 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.205621 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.205624 | orchestrator | 2026-04-20 00:57:41.205628 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2026-04-20 00:57:41.205632 | orchestrator | Monday 
20 April 2026 00:55:52 +0000 (0:00:00.904) 0:08:49.304 ********** 2026-04-20 00:57:41.205635 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.205639 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.205647 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.205651 | orchestrator | 2026-04-20 00:57:41.205655 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2026-04-20 00:57:41.205658 | orchestrator | Monday 20 April 2026 00:55:53 +0000 (0:00:00.298) 0:08:49.602 ********** 2026-04-20 00:57:41.205662 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.205666 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.205669 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.205673 | orchestrator | 2026-04-20 00:57:41.205677 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2026-04-20 00:57:41.205680 | orchestrator | Monday 20 April 2026 00:55:53 +0000 (0:00:00.332) 0:08:49.934 ********** 2026-04-20 00:57:41.205684 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.205688 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.205692 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.205696 | orchestrator | 2026-04-20 00:57:41.205699 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2026-04-20 00:57:41.205703 | orchestrator | Monday 20 April 2026 00:55:53 +0000 (0:00:00.308) 0:08:50.243 ********** 2026-04-20 00:57:41.205707 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.205710 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.205714 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.205718 | orchestrator | 2026-04-20 00:57:41.205721 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2026-04-20 00:57:41.205725 | orchestrator | Monday 20 April 2026 00:55:54 +0000 
(0:00:00.579) 0:08:50.822 ********** 2026-04-20 00:57:41.205729 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.205733 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.205736 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.205740 | orchestrator | 2026-04-20 00:57:41.205744 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2026-04-20 00:57:41.205747 | orchestrator | Monday 20 April 2026 00:55:54 +0000 (0:00:00.348) 0:08:51.171 ********** 2026-04-20 00:57:41.205751 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.205755 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.205759 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.205762 | orchestrator | 2026-04-20 00:57:41.205766 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2026-04-20 00:57:41.205770 | orchestrator | Monday 20 April 2026 00:55:55 +0000 (0:00:00.338) 0:08:51.510 ********** 2026-04-20 00:57:41.205774 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.205777 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.205781 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.205785 | orchestrator | 2026-04-20 00:57:41.205788 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2026-04-20 00:57:41.205792 | orchestrator | Monday 20 April 2026 00:55:55 +0000 (0:00:00.291) 0:08:51.801 ********** 2026-04-20 00:57:41.205796 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.205800 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.205803 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.205807 | orchestrator | 2026-04-20 00:57:41.205811 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2026-04-20 00:57:41.205818 | orchestrator | Monday 20 April 2026 00:55:55 +0000 (0:00:00.564) 
0:08:52.366 ********** 2026-04-20 00:57:41.205827 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.205830 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.205834 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.205838 | orchestrator | 2026-04-20 00:57:41.205842 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2026-04-20 00:57:41.205846 | orchestrator | Monday 20 April 2026 00:55:56 +0000 (0:00:00.339) 0:08:52.706 ********** 2026-04-20 00:57:41.205849 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.205853 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.205857 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.205860 | orchestrator | 2026-04-20 00:57:41.205864 | orchestrator | TASK [ceph-mds : Include create_mds_filesystems.yml] *************************** 2026-04-20 00:57:41.205868 | orchestrator | Monday 20 April 2026 00:55:56 +0000 (0:00:00.565) 0:08:53.272 ********** 2026-04-20 00:57:41.205872 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.205875 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.205879 | orchestrator | included: /ansible/roles/ceph-mds/tasks/create_mds_filesystems.yml for testbed-node-3 2026-04-20 00:57:41.205884 | orchestrator | 2026-04-20 00:57:41.205888 | orchestrator | TASK [ceph-facts : Get current default crush rule details] ********************* 2026-04-20 00:57:41.205892 | orchestrator | Monday 20 April 2026 00:55:57 +0000 (0:00:00.820) 0:08:54.092 ********** 2026-04-20 00:57:41.205895 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] 2026-04-20 00:57:41.205899 | orchestrator | 2026-04-20 00:57:41.205903 | orchestrator | TASK [ceph-facts : Get current default crush rule name] ************************ 2026-04-20 00:57:41.205907 | orchestrator | Monday 20 April 2026 00:56:00 +0000 (0:00:02.371) 0:08:56.463 ********** 2026-04-20 00:57:41.205912 | orchestrator | skipping: [testbed-node-3] 
=> (item={'rule_id': 0, 'rule_name': 'replicated_rule', 'type': 1, 'steps': [{'op': 'take', 'item': -1, 'item_name': 'default'}, {'op': 'chooseleaf_firstn', 'num': 0, 'type': 'host'}, {'op': 'emit'}]})  2026-04-20 00:57:41.205919 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.205922 | orchestrator | 2026-04-20 00:57:41.205926 | orchestrator | TASK [ceph-mds : Create filesystem pools] ************************************** 2026-04-20 00:57:41.205930 | orchestrator | Monday 20 April 2026 00:56:00 +0000 (0:00:00.232) 0:08:56.696 ********** 2026-04-20 00:57:41.205936 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'application': 'cephfs', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'cephfs_data', 'pg_num': 16, 'pgp_num': 16, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2026-04-20 00:57:41.205948 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'application': 'cephfs', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'cephfs_metadata', 'pg_num': 16, 'pgp_num': 16, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2026-04-20 00:57:41.205952 | orchestrator | 2026-04-20 00:57:41.205956 | orchestrator | TASK [ceph-mds : Create ceph filesystem] *************************************** 2026-04-20 00:57:41.205960 | orchestrator | Monday 20 April 2026 00:56:06 +0000 (0:00:06.266) 0:09:02.963 ********** 2026-04-20 00:57:41.205964 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] 2026-04-20 00:57:41.205967 | orchestrator | 2026-04-20 00:57:41.205971 | orchestrator | TASK [ceph-mds : Include common.yml] ******************************************* 2026-04-20 00:57:41.205975 | orchestrator | Monday 20 April 2026 00:56:10 +0000 (0:00:03.736) 0:09:06.700 ********** 2026-04-20 00:57:41.205978 | orchestrator | included: /ansible/roles/ceph-mds/tasks/common.yml for testbed-node-3, testbed-node-4, 
testbed-node-5 2026-04-20 00:57:41.205982 | orchestrator | 2026-04-20 00:57:41.205986 | orchestrator | TASK [ceph-mds : Create bootstrap-mds and mds directories] ********************* 2026-04-20 00:57:41.205990 | orchestrator | Monday 20 April 2026 00:56:11 +0000 (0:00:00.751) 0:09:07.451 ********** 2026-04-20 00:57:41.205994 | orchestrator | ok: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-mds/) 2026-04-20 00:57:41.206001 | orchestrator | ok: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-mds/) 2026-04-20 00:57:41.206005 | orchestrator | ok: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-mds/) 2026-04-20 00:57:41.206009 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/mds/ceph-testbed-node-3) 2026-04-20 00:57:41.206063 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/mds/ceph-testbed-node-4) 2026-04-20 00:57:41.206068 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/mds/ceph-testbed-node-5) 2026-04-20 00:57:41.206072 | orchestrator | 2026-04-20 00:57:41.206076 | orchestrator | TASK [ceph-mds : Get keys from monitors] *************************************** 2026-04-20 00:57:41.206079 | orchestrator | Monday 20 April 2026 00:56:12 +0000 (0:00:01.099) 0:09:08.551 ********** 2026-04-20 00:57:41.206083 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-20 00:57:41.206087 | orchestrator | skipping: [testbed-node-3] => (item=None)  2026-04-20 00:57:41.206091 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}] 2026-04-20 00:57:41.206095 | orchestrator | 2026-04-20 00:57:41.206099 | orchestrator | TASK [ceph-mds : Copy ceph key(s) if needed] *********************************** 2026-04-20 00:57:41.206103 | orchestrator | Monday 20 April 2026 00:56:14 +0000 (0:00:02.109) 0:09:10.661 ********** 2026-04-20 00:57:41.206106 | orchestrator | changed: [testbed-node-3] => (item=None) 2026-04-20 00:57:41.206110 | orchestrator | skipping: [testbed-node-3] 
=> (item=None)  2026-04-20 00:57:41.206114 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.206118 | orchestrator | changed: [testbed-node-4] => (item=None) 2026-04-20 00:57:41.206121 | orchestrator | skipping: [testbed-node-4] => (item=None)  2026-04-20 00:57:41.206129 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.206133 | orchestrator | changed: [testbed-node-5] => (item=None) 2026-04-20 00:57:41.206137 | orchestrator | skipping: [testbed-node-5] => (item=None)  2026-04-20 00:57:41.206141 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.206144 | orchestrator | 2026-04-20 00:57:41.206148 | orchestrator | TASK [ceph-mds : Create mds keyring] ******************************************* 2026-04-20 00:57:41.206152 | orchestrator | Monday 20 April 2026 00:56:15 +0000 (0:00:01.221) 0:09:11.882 ********** 2026-04-20 00:57:41.206155 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.206159 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.206163 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.206166 | orchestrator | 2026-04-20 00:57:41.206170 | orchestrator | TASK [ceph-mds : Non_containerized.yml] **************************************** 2026-04-20 00:57:41.206174 | orchestrator | Monday 20 April 2026 00:56:18 +0000 (0:00:02.651) 0:09:14.534 ********** 2026-04-20 00:57:41.206178 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.206181 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.206185 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.206189 | orchestrator | 2026-04-20 00:57:41.206192 | orchestrator | TASK [ceph-mds : Containerized.yml] ******************************************** 2026-04-20 00:57:41.206196 | orchestrator | Monday 20 April 2026 00:56:18 +0000 (0:00:00.412) 0:09:14.946 ********** 2026-04-20 00:57:41.206200 | orchestrator | included: /ansible/roles/ceph-mds/tasks/containerized.yml for testbed-node-3, testbed-node-4, 
testbed-node-5 2026-04-20 00:57:41.206206 | orchestrator | 2026-04-20 00:57:41.206233 | orchestrator | TASK [ceph-mds : Include_tasks systemd.yml] ************************************ 2026-04-20 00:57:41.206238 | orchestrator | Monday 20 April 2026 00:56:19 +0000 (0:00:00.463) 0:09:15.410 ********** 2026-04-20 00:57:41.206244 | orchestrator | included: /ansible/roles/ceph-mds/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:57:41.206249 | orchestrator | 2026-04-20 00:57:41.206254 | orchestrator | TASK [ceph-mds : Generate systemd unit file] *********************************** 2026-04-20 00:57:41.206260 | orchestrator | Monday 20 April 2026 00:56:19 +0000 (0:00:00.573) 0:09:15.983 ********** 2026-04-20 00:57:41.206265 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.206272 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.206283 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.206289 | orchestrator | 2026-04-20 00:57:41.206295 | orchestrator | TASK [ceph-mds : Generate systemd ceph-mds target file] ************************ 2026-04-20 00:57:41.206301 | orchestrator | Monday 20 April 2026 00:56:20 +0000 (0:00:01.190) 0:09:17.173 ********** 2026-04-20 00:57:41.206308 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.206314 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.206320 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.206326 | orchestrator | 2026-04-20 00:57:41.206333 | orchestrator | TASK [ceph-mds : Enable ceph-mds.target] *************************************** 2026-04-20 00:57:41.206337 | orchestrator | Monday 20 April 2026 00:56:22 +0000 (0:00:01.257) 0:09:18.430 ********** 2026-04-20 00:57:41.206341 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.206345 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.206348 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.206352 | orchestrator | 2026-04-20 
00:57:41.206360 | orchestrator | TASK [ceph-mds : Systemd start mds container] ********************************** 2026-04-20 00:57:41.206366 | orchestrator | Monday 20 April 2026 00:56:23 +0000 (0:00:01.839) 0:09:20.270 ********** 2026-04-20 00:57:41.206372 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.206377 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.206384 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.206390 | orchestrator | 2026-04-20 00:57:41.206396 | orchestrator | TASK [ceph-mds : Wait for mds socket to exist] ********************************* 2026-04-20 00:57:41.206402 | orchestrator | Monday 20 April 2026 00:56:26 +0000 (0:00:02.203) 0:09:22.473 ********** 2026-04-20 00:57:41.206409 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.206413 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.206417 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.206421 | orchestrator | 2026-04-20 00:57:41.206425 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2026-04-20 00:57:41.206428 | orchestrator | Monday 20 April 2026 00:56:27 +0000 (0:00:01.181) 0:09:23.655 ********** 2026-04-20 00:57:41.206432 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.206436 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.206440 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.206443 | orchestrator | 2026-04-20 00:57:41.206447 | orchestrator | RUNNING HANDLER [ceph-handler : Mdss handler] ********************************** 2026-04-20 00:57:41.206452 | orchestrator | Monday 20 April 2026 00:56:28 +0000 (0:00:00.929) 0:09:24.585 ********** 2026-04-20 00:57:41.206459 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mdss.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:57:41.206465 | orchestrator | 2026-04-20 00:57:41.206471 | orchestrator | RUNNING HANDLER [ceph-handler : Set 
_mds_handler_called before restart] ******** 2026-04-20 00:57:41.206477 | orchestrator | Monday 20 April 2026 00:56:28 +0000 (0:00:00.443) 0:09:25.028 ********** 2026-04-20 00:57:41.206483 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.206489 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.206495 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.206501 | orchestrator | 2026-04-20 00:57:41.206507 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mds restart script] *********************** 2026-04-20 00:57:41.206513 | orchestrator | Monday 20 April 2026 00:56:28 +0000 (0:00:00.258) 0:09:25.286 ********** 2026-04-20 00:57:41.206521 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.206525 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.206528 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.206532 | orchestrator | 2026-04-20 00:57:41.206536 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mds daemon(s)] ******************** 2026-04-20 00:57:41.206540 | orchestrator | Monday 20 April 2026 00:56:30 +0000 (0:00:01.557) 0:09:26.843 ********** 2026-04-20 00:57:41.206543 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-20 00:57:41.206547 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-20 00:57:41.206551 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-20 00:57:41.206562 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.206566 | orchestrator | 2026-04-20 00:57:41.206570 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mds_handler_called after restart] ********* 2026-04-20 00:57:41.206574 | orchestrator | Monday 20 April 2026 00:56:31 +0000 (0:00:00.601) 0:09:27.445 ********** 2026-04-20 00:57:41.206577 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.206581 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.206585 | orchestrator | ok: [testbed-node-5] 2026-04-20 
00:57:41.206589 | orchestrator | 2026-04-20 00:57:41.206592 | orchestrator | PLAY [Apply role ceph-rgw] ***************************************************** 2026-04-20 00:57:41.206596 | orchestrator | 2026-04-20 00:57:41.206600 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2026-04-20 00:57:41.206604 | orchestrator | Monday 20 April 2026 00:56:31 +0000 (0:00:00.502) 0:09:27.947 ********** 2026-04-20 00:57:41.206607 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:57:41.206612 | orchestrator | 2026-04-20 00:57:41.206616 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2026-04-20 00:57:41.206619 | orchestrator | Monday 20 April 2026 00:56:32 +0000 (0:00:00.686) 0:09:28.634 ********** 2026-04-20 00:57:41.206623 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:57:41.206627 | orchestrator | 2026-04-20 00:57:41.206631 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2026-04-20 00:57:41.206637 | orchestrator | Monday 20 April 2026 00:56:32 +0000 (0:00:00.500) 0:09:29.134 ********** 2026-04-20 00:57:41.206643 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.206648 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.206657 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.206664 | orchestrator | 2026-04-20 00:57:41.206670 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2026-04-20 00:57:41.206675 | orchestrator | Monday 20 April 2026 00:56:33 +0000 (0:00:00.284) 0:09:29.418 ********** 2026-04-20 00:57:41.206681 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.206687 | orchestrator | ok: [testbed-node-4] 2026-04-20 
00:57:41.206693 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.206699 | orchestrator | 2026-04-20 00:57:41.206704 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2026-04-20 00:57:41.206710 | orchestrator | Monday 20 April 2026 00:56:34 +0000 (0:00:00.959) 0:09:30.378 ********** 2026-04-20 00:57:41.206715 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.206720 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.206726 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.206731 | orchestrator | 2026-04-20 00:57:41.206737 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2026-04-20 00:57:41.206784 | orchestrator | Monday 20 April 2026 00:56:34 +0000 (0:00:00.741) 0:09:31.119 ********** 2026-04-20 00:57:41.206794 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.206800 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.206806 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.206813 | orchestrator | 2026-04-20 00:57:41.206817 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2026-04-20 00:57:41.206825 | orchestrator | Monday 20 April 2026 00:56:35 +0000 (0:00:00.726) 0:09:31.846 ********** 2026-04-20 00:57:41.206829 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.206833 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.206837 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.206841 | orchestrator | 2026-04-20 00:57:41.206844 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2026-04-20 00:57:41.206865 | orchestrator | Monday 20 April 2026 00:56:35 +0000 (0:00:00.297) 0:09:32.143 ********** 2026-04-20 00:57:41.206869 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.206873 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.206877 | orchestrator | skipping: 
[testbed-node-5] 2026-04-20 00:57:41.206885 | orchestrator | 2026-04-20 00:57:41.206889 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2026-04-20 00:57:41.206893 | orchestrator | Monday 20 April 2026 00:56:36 +0000 (0:00:00.555) 0:09:32.699 ********** 2026-04-20 00:57:41.206897 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.206900 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.206904 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.206908 | orchestrator | 2026-04-20 00:57:41.206911 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2026-04-20 00:57:41.206915 | orchestrator | Monday 20 April 2026 00:56:36 +0000 (0:00:00.285) 0:09:32.984 ********** 2026-04-20 00:57:41.206919 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.206923 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.206927 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.206930 | orchestrator | 2026-04-20 00:57:41.206934 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2026-04-20 00:57:41.206938 | orchestrator | Monday 20 April 2026 00:56:37 +0000 (0:00:00.759) 0:09:33.744 ********** 2026-04-20 00:57:41.206942 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.206946 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.206949 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.206953 | orchestrator | 2026-04-20 00:57:41.206957 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2026-04-20 00:57:41.206960 | orchestrator | Monday 20 April 2026 00:56:38 +0000 (0:00:00.697) 0:09:34.441 ********** 2026-04-20 00:57:41.206964 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.206968 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.206972 | orchestrator | skipping: [testbed-node-5] 2026-04-20 
00:57:41.206975 | orchestrator | 2026-04-20 00:57:41.206979 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2026-04-20 00:57:41.206983 | orchestrator | Monday 20 April 2026 00:56:38 +0000 (0:00:00.451) 0:09:34.893 ********** 2026-04-20 00:57:41.206986 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.206990 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.206994 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.206997 | orchestrator | 2026-04-20 00:57:41.207001 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2026-04-20 00:57:41.207005 | orchestrator | Monday 20 April 2026 00:56:38 +0000 (0:00:00.260) 0:09:35.153 ********** 2026-04-20 00:57:41.207014 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.207018 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.207022 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.207025 | orchestrator | 2026-04-20 00:57:41.207029 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2026-04-20 00:57:41.207033 | orchestrator | Monday 20 April 2026 00:56:39 +0000 (0:00:00.275) 0:09:35.428 ********** 2026-04-20 00:57:41.207037 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.207040 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.207044 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.207048 | orchestrator | 2026-04-20 00:57:41.207052 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2026-04-20 00:57:41.207055 | orchestrator | Monday 20 April 2026 00:56:39 +0000 (0:00:00.264) 0:09:35.693 ********** 2026-04-20 00:57:41.207059 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.207063 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.207067 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.207071 | orchestrator | 2026-04-20 
00:57:41.207074 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2026-04-20 00:57:41.207078 | orchestrator | Monday 20 April 2026 00:56:39 +0000 (0:00:00.446) 0:09:36.139 ********** 2026-04-20 00:57:41.207082 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.207085 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.207089 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.207093 | orchestrator | 2026-04-20 00:57:41.207097 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2026-04-20 00:57:41.207111 | orchestrator | Monday 20 April 2026 00:56:40 +0000 (0:00:00.273) 0:09:36.413 ********** 2026-04-20 00:57:41.207115 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.207119 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.207122 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.207126 | orchestrator | 2026-04-20 00:57:41.207130 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2026-04-20 00:57:41.207134 | orchestrator | Monday 20 April 2026 00:56:40 +0000 (0:00:00.239) 0:09:36.652 ********** 2026-04-20 00:57:41.207137 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.207141 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.207145 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.207149 | orchestrator | 2026-04-20 00:57:41.207152 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2026-04-20 00:57:41.207156 | orchestrator | Monday 20 April 2026 00:56:40 +0000 (0:00:00.261) 0:09:36.914 ********** 2026-04-20 00:57:41.207160 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.207164 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.207167 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.207171 | orchestrator | 2026-04-20 00:57:41.207175 | 
orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2026-04-20 00:57:41.207179 | orchestrator | Monday 20 April 2026 00:56:40 +0000 (0:00:00.440) 0:09:37.354 ********** 2026-04-20 00:57:41.207190 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.207194 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.207198 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.207202 | orchestrator | 2026-04-20 00:57:41.207206 | orchestrator | TASK [ceph-rgw : Include common.yml] ******************************************* 2026-04-20 00:57:41.207222 | orchestrator | Monday 20 April 2026 00:56:41 +0000 (0:00:00.463) 0:09:37.818 ********** 2026-04-20 00:57:41.207230 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/common.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:57:41.207234 | orchestrator | 2026-04-20 00:57:41.207238 | orchestrator | TASK [ceph-rgw : Get keys from monitors] *************************************** 2026-04-20 00:57:41.207242 | orchestrator | Monday 20 April 2026 00:56:42 +0000 (0:00:00.652) 0:09:38.470 ********** 2026-04-20 00:57:41.207245 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-20 00:57:41.207249 | orchestrator | skipping: [testbed-node-3] => (item=None)  2026-04-20 00:57:41.207253 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}] 2026-04-20 00:57:41.207257 | orchestrator | 2026-04-20 00:57:41.207260 | orchestrator | TASK [ceph-rgw : Copy ceph key(s) if needed] *********************************** 2026-04-20 00:57:41.207264 | orchestrator | Monday 20 April 2026 00:56:44 +0000 (0:00:02.102) 0:09:40.573 ********** 2026-04-20 00:57:41.207268 | orchestrator | changed: [testbed-node-3] => (item=None) 2026-04-20 00:57:41.207272 | orchestrator | skipping: [testbed-node-3] => (item=None)  2026-04-20 00:57:41.207275 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.207279 | orchestrator 
| changed: [testbed-node-4] => (item=None) 2026-04-20 00:57:41.207283 | orchestrator | skipping: [testbed-node-4] => (item=None)  2026-04-20 00:57:41.207286 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.207290 | orchestrator | changed: [testbed-node-5] => (item=None) 2026-04-20 00:57:41.207294 | orchestrator | skipping: [testbed-node-5] => (item=None)  2026-04-20 00:57:41.207298 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.207301 | orchestrator | 2026-04-20 00:57:41.207305 | orchestrator | TASK [ceph-rgw : Copy SSL certificate & key data to certificate path] ********** 2026-04-20 00:57:41.207309 | orchestrator | Monday 20 April 2026 00:56:45 +0000 (0:00:01.145) 0:09:41.718 ********** 2026-04-20 00:57:41.207312 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.207316 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.207320 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.207323 | orchestrator | 2026-04-20 00:57:41.207327 | orchestrator | TASK [ceph-rgw : Include_tasks pre_requisite.yml] ****************************** 2026-04-20 00:57:41.207334 | orchestrator | Monday 20 April 2026 00:56:45 +0000 (0:00:00.273) 0:09:41.992 ********** 2026-04-20 00:57:41.207338 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/pre_requisite.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:57:41.207342 | orchestrator | 2026-04-20 00:57:41.207345 | orchestrator | TASK [ceph-rgw : Create rados gateway directories] ***************************** 2026-04-20 00:57:41.207349 | orchestrator | Monday 20 April 2026 00:56:46 +0000 (0:00:00.608) 0:09:42.601 ********** 2026-04-20 00:57:41.207353 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081}) 2026-04-20 00:57:41.207362 | orchestrator | changed: [testbed-node-4 -> testbed-node-0(192.168.16.10)] => 
(item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081}) 2026-04-20 00:57:41.207366 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081}) 2026-04-20 00:57:41.207370 | orchestrator | 2026-04-20 00:57:41.207374 | orchestrator | TASK [ceph-rgw : Create rgw keyrings] ****************************************** 2026-04-20 00:57:41.207377 | orchestrator | Monday 20 April 2026 00:56:47 +0000 (0:00:00.791) 0:09:43.392 ********** 2026-04-20 00:57:41.207381 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-20 00:57:41.207385 | orchestrator | changed: [testbed-node-3 -> {{ groups[mon_group_name][0] if groups.get(mon_group_name, []) | length > 0 else 'localhost' }}] 2026-04-20 00:57:41.207388 | orchestrator | changed: [testbed-node-4 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-20 00:57:41.207392 | orchestrator | changed: [testbed-node-4 -> {{ groups[mon_group_name][0] if groups.get(mon_group_name, []) | length > 0 else 'localhost' }}] 2026-04-20 00:57:41.207396 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-20 00:57:41.207400 | orchestrator | changed: [testbed-node-5 -> {{ groups[mon_group_name][0] if groups.get(mon_group_name, []) | length > 0 else 'localhost' }}] 2026-04-20 00:57:41.207403 | orchestrator | 2026-04-20 00:57:41.207407 | orchestrator | TASK [ceph-rgw : Get keys from monitors] *************************************** 2026-04-20 00:57:41.207411 | orchestrator | Monday 20 April 2026 00:56:51 +0000 (0:00:04.361) 0:09:47.753 ********** 2026-04-20 00:57:41.207415 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-20 00:57:41.207418 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}] 2026-04-20 00:57:41.207422 | orchestrator | 
ok: [testbed-node-4 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-20 00:57:41.207426 | orchestrator | ok: [testbed-node-4 -> {{ groups.get(mon_group_name)[0] }}] 2026-04-20 00:57:41.207430 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2026-04-20 00:57:41.207433 | orchestrator | ok: [testbed-node-5 -> {{ groups.get(mon_group_name)[0] }}] 2026-04-20 00:57:41.207437 | orchestrator | 2026-04-20 00:57:41.207441 | orchestrator | TASK [ceph-rgw : Copy ceph key(s) if needed] *********************************** 2026-04-20 00:57:41.207445 | orchestrator | Monday 20 April 2026 00:56:53 +0000 (0:00:02.212) 0:09:49.965 ********** 2026-04-20 00:57:41.207448 | orchestrator | changed: [testbed-node-3] => (item=None) 2026-04-20 00:57:41.207452 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.207456 | orchestrator | changed: [testbed-node-4] => (item=None) 2026-04-20 00:57:41.207459 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.207468 | orchestrator | changed: [testbed-node-5] => (item=None) 2026-04-20 00:57:41.207472 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.207476 | orchestrator | 2026-04-20 00:57:41.207480 | orchestrator | TASK [ceph-rgw : Rgw pool creation tasks] ************************************** 2026-04-20 00:57:41.207484 | orchestrator | Monday 20 April 2026 00:56:55 +0000 (0:00:01.463) 0:09:51.429 ********** 2026-04-20 00:57:41.207487 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/rgw_create_pools.yml for testbed-node-3 2026-04-20 00:57:41.207494 | orchestrator | 2026-04-20 00:57:41.207498 | orchestrator | TASK [ceph-rgw : Create ec profile] ******************************************** 2026-04-20 00:57:41.207501 | orchestrator | Monday 20 April 2026 00:56:55 +0000 (0:00:00.189) 0:09:51.619 ********** 2026-04-20 00:57:41.207505 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.data', 'value': {'pg_num': 8, 'size': 3, 'type': 
'replicated'}})  2026-04-20 00:57:41.207509 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.index', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2026-04-20 00:57:41.207513 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.control', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2026-04-20 00:57:41.207517 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.log', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2026-04-20 00:57:41.207521 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.meta', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2026-04-20 00:57:41.207525 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.207528 | orchestrator | 2026-04-20 00:57:41.207532 | orchestrator | TASK [ceph-rgw : Set crush rule] *********************************************** 2026-04-20 00:57:41.207536 | orchestrator | Monday 20 April 2026 00:56:55 +0000 (0:00:00.511) 0:09:52.130 ********** 2026-04-20 00:57:41.207540 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.data', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2026-04-20 00:57:41.207543 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.index', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2026-04-20 00:57:41.207547 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.control', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2026-04-20 00:57:41.207551 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.log', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2026-04-20 00:57:41.207558 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.meta', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2026-04-20 00:57:41.207562 | orchestrator | skipping: [testbed-node-3] 2026-04-20 
00:57:41.207566 | orchestrator | 2026-04-20 00:57:41.207569 | orchestrator | TASK [ceph-rgw : Create rgw pools] ********************************************* 2026-04-20 00:57:41.207573 | orchestrator | Monday 20 April 2026 00:56:56 +0000 (0:00:00.511) 0:09:52.642 ********** 2026-04-20 00:57:41.207577 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.buckets.data', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2026-04-20 00:57:41.207581 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.buckets.index', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2026-04-20 00:57:41.207585 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.control', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2026-04-20 00:57:41.207588 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.log', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2026-04-20 00:57:41.207592 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.meta', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2026-04-20 00:57:41.207596 | orchestrator | 2026-04-20 00:57:41.207600 | orchestrator | TASK [ceph-rgw : Include_tasks openstack-keystone.yml] ************************* 2026-04-20 00:57:41.207604 | orchestrator | Monday 20 April 2026 00:57:26 +0000 (0:00:30.654) 0:10:23.296 ********** 2026-04-20 00:57:41.207607 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.207611 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.207615 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.207619 | orchestrator | 2026-04-20 00:57:41.207626 | orchestrator | TASK [ceph-rgw : Include_tasks start_radosgw.yml] ****************************** 2026-04-20 00:57:41.207630 | orchestrator | 
Monday 20 April 2026 00:57:27 +0000 (0:00:00.282) 0:10:23.579 ********** 2026-04-20 00:57:41.207633 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.207637 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.207641 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.207645 | orchestrator | 2026-04-20 00:57:41.207648 | orchestrator | TASK [ceph-rgw : Include start_docker_rgw.yml] ********************************* 2026-04-20 00:57:41.207652 | orchestrator | Monday 20 April 2026 00:57:27 +0000 (0:00:00.409) 0:10:23.988 ********** 2026-04-20 00:57:41.207656 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/start_docker_rgw.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:57:41.207660 | orchestrator | 2026-04-20 00:57:41.207663 | orchestrator | TASK [ceph-rgw : Include_task systemd.yml] ************************************* 2026-04-20 00:57:41.207670 | orchestrator | Monday 20 April 2026 00:57:28 +0000 (0:00:00.485) 0:10:24.473 ********** 2026-04-20 00:57:41.207674 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:57:41.207678 | orchestrator | 2026-04-20 00:57:41.207682 | orchestrator | TASK [ceph-rgw : Generate systemd unit file] *********************************** 2026-04-20 00:57:41.207685 | orchestrator | Monday 20 April 2026 00:57:28 +0000 (0:00:00.575) 0:10:25.049 ********** 2026-04-20 00:57:41.207689 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.207693 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.207696 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.207700 | orchestrator | 2026-04-20 00:57:41.207704 | orchestrator | TASK [ceph-rgw : Generate systemd ceph-radosgw target file] ******************** 2026-04-20 00:57:41.207707 | orchestrator | Monday 20 April 2026 00:57:29 +0000 (0:00:01.165) 0:10:26.214 ********** 2026-04-20 00:57:41.207711 | orchestrator | changed: 
[testbed-node-3] 2026-04-20 00:57:41.207715 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.207719 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.207724 | orchestrator | 2026-04-20 00:57:41.207730 | orchestrator | TASK [ceph-rgw : Enable ceph-radosgw.target] *********************************** 2026-04-20 00:57:41.207735 | orchestrator | Monday 20 April 2026 00:57:30 +0000 (0:00:01.116) 0:10:27.330 ********** 2026-04-20 00:57:41.207741 | orchestrator | changed: [testbed-node-3] 2026-04-20 00:57:41.207747 | orchestrator | changed: [testbed-node-5] 2026-04-20 00:57:41.207753 | orchestrator | changed: [testbed-node-4] 2026-04-20 00:57:41.207758 | orchestrator | 2026-04-20 00:57:41.207764 | orchestrator | TASK [ceph-rgw : Systemd start rgw container] ********************************** 2026-04-20 00:57:41.207770 | orchestrator | Monday 20 April 2026 00:57:32 +0000 (0:00:01.924) 0:10:29.255 ********** 2026-04-20 00:57:41.207776 | orchestrator | changed: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081}) 2026-04-20 00:57:41.207781 | orchestrator | changed: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081}) 2026-04-20 00:57:41.207787 | orchestrator | changed: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081}) 2026-04-20 00:57:41.207792 | orchestrator | 2026-04-20 00:57:41.207798 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2026-04-20 00:57:41.207804 | orchestrator | Monday 20 April 2026 00:57:35 +0000 (0:00:02.543) 0:10:31.799 ********** 2026-04-20 00:57:41.207811 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.207817 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.207823 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.207829 | orchestrator 
| 2026-04-20 00:57:41.207835 | orchestrator | RUNNING HANDLER [ceph-handler : Rgws handler] ********************************** 2026-04-20 00:57:41.207845 | orchestrator | Monday 20 April 2026 00:57:35 +0000 (0:00:00.285) 0:10:32.084 ********** 2026-04-20 00:57:41.207852 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_rgws.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:57:41.207864 | orchestrator | 2026-04-20 00:57:41.207868 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called before restart] ******** 2026-04-20 00:57:41.207872 | orchestrator | Monday 20 April 2026 00:57:36 +0000 (0:00:00.598) 0:10:32.683 ********** 2026-04-20 00:57:41.207876 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.207880 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.207883 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.207887 | orchestrator | 2026-04-20 00:57:41.207891 | orchestrator | RUNNING HANDLER [ceph-handler : Copy rgw restart script] *********************** 2026-04-20 00:57:41.207895 | orchestrator | Monday 20 April 2026 00:57:36 +0000 (0:00:00.273) 0:10:32.957 ********** 2026-04-20 00:57:41.207898 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:57:41.207902 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:57:41.207906 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:57:41.207910 | orchestrator | 2026-04-20 00:57:41.207913 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph rgw daemon(s)] ******************** 2026-04-20 00:57:41.207917 | orchestrator | Monday 20 April 2026 00:57:36 +0000 (0:00:00.282) 0:10:33.239 ********** 2026-04-20 00:57:41.207921 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2026-04-20 00:57:41.207924 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2026-04-20 00:57:41.207928 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2026-04-20 00:57:41.207932 | orchestrator 
| skipping: [testbed-node-3] 2026-04-20 00:57:41.207936 | orchestrator | 2026-04-20 00:57:41.207939 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called after restart] ********* 2026-04-20 00:57:41.207943 | orchestrator | Monday 20 April 2026 00:57:37 +0000 (0:00:00.711) 0:10:33.951 ********** 2026-04-20 00:57:41.207947 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:57:41.207950 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:57:41.207954 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:57:41.207958 | orchestrator | 2026-04-20 00:57:41.207962 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:57:41.207965 | orchestrator | testbed-node-0 : ok=134  changed=35  unreachable=0 failed=0 skipped=125  rescued=0 ignored=0 2026-04-20 00:57:41.207970 | orchestrator | testbed-node-1 : ok=127  changed=31  unreachable=0 failed=0 skipped=120  rescued=0 ignored=0 2026-04-20 00:57:41.207973 | orchestrator | testbed-node-2 : ok=134  changed=33  unreachable=0 failed=0 skipped=119  rescued=0 ignored=0 2026-04-20 00:57:41.207977 | orchestrator | testbed-node-3 : ok=193  changed=45  unreachable=0 failed=0 skipped=162  rescued=0 ignored=0 2026-04-20 00:57:41.207985 | orchestrator | testbed-node-4 : ok=175  changed=40  unreachable=0 failed=0 skipped=123  rescued=0 ignored=0 2026-04-20 00:57:41.207988 | orchestrator | testbed-node-5 : ok=177  changed=41  unreachable=0 failed=0 skipped=121  rescued=0 ignored=0 2026-04-20 00:57:41.207992 | orchestrator | 2026-04-20 00:57:41.207996 | orchestrator | 2026-04-20 00:57:41.208000 | orchestrator | 2026-04-20 00:57:41.208004 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 00:57:41.208007 | orchestrator | Monday 20 April 2026 00:57:37 +0000 (0:00:00.386) 0:10:34.337 ********** 2026-04-20 00:57:41.208011 | orchestrator | =============================================================================== 
2026-04-20 00:57:41.208015 | orchestrator | ceph-container-common : Pulling Ceph container image ------------------- 44.96s 2026-04-20 00:57:41.208019 | orchestrator | ceph-osd : Use ceph-volume to create osds ------------------------------ 42.94s 2026-04-20 00:57:41.208022 | orchestrator | ceph-mgr : Wait for all mgr to be up ----------------------------------- 36.64s 2026-04-20 00:57:41.208026 | orchestrator | ceph-rgw : Create rgw pools -------------------------------------------- 30.65s 2026-04-20 00:57:41.208034 | orchestrator | ceph-mon : Waiting for the monitor(s) to form the quorum... ------------ 22.05s 2026-04-20 00:57:41.208038 | orchestrator | ceph-mon : Set cluster configs ----------------------------------------- 14.42s 2026-04-20 00:57:41.208042 | orchestrator | ceph-osd : Wait for all osd to be up ----------------------------------- 12.63s 2026-04-20 00:57:41.208045 | orchestrator | ceph-mgr : Create ceph mgr keyring(s) on a mon node --------------------- 9.44s 2026-04-20 00:57:41.208049 | orchestrator | ceph-mon : Fetch ceph initial keys -------------------------------------- 9.11s 2026-04-20 00:57:41.208053 | orchestrator | ceph-config : Create ceph initial directories --------------------------- 7.05s 2026-04-20 00:57:41.208057 | orchestrator | ceph-mgr : Disable ceph mgr enabled modules ----------------------------- 6.45s 2026-04-20 00:57:41.208060 | orchestrator | ceph-mds : Create filesystem pools -------------------------------------- 6.27s 2026-04-20 00:57:41.208064 | orchestrator | ceph-mgr : Add modules to ceph-mgr -------------------------------------- 4.67s 2026-04-20 00:57:41.208068 | orchestrator | ceph-rgw : Create rgw keyrings ------------------------------------------ 4.36s 2026-04-20 00:57:41.208071 | orchestrator | ceph-mds : Create ceph filesystem --------------------------------------- 3.74s 2026-04-20 00:57:41.208075 | orchestrator | ceph-osd : Systemd start osd -------------------------------------------- 3.53s 2026-04-20 
00:57:41.208079 | orchestrator | ceph-mon : Copy admin keyring over to mons ------------------------------ 3.45s 2026-04-20 00:57:41.208082 | orchestrator | ceph-config : Run 'ceph-volume lvm list' to see how many osds have already been created --- 3.45s 2026-04-20 00:57:41.208088 | orchestrator | ceph-crash : Start the ceph-crash service ------------------------------- 3.41s 2026-04-20 00:57:41.208092 | orchestrator | ceph-facts : Set_fact ceph_admin_command -------------------------------- 3.40s 2026-04-20 00:57:41.208096 | orchestrator | 2026-04-20 00:57:41 | INFO  | Task 311c84ec-0b3d-44d1-9a16-6943e16ebba1 is in state SUCCESS 2026-04-20 00:57:41.208100 | orchestrator | 2026-04-20 00:57:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:57:44.233038 | orchestrator | 2026-04-20 00:57:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 00:57:44.235341 | orchestrator | 2026-04-20 00:57:44 | INFO  | Task 6fb1a72a-3ae9-4a52-a0d7-88749d9cdc04 is in state STARTED 2026-04-20 00:57:44.235420 | orchestrator | 2026-04-20 00:57:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:57:47.269831 | orchestrator | 2026-04-20 00:57:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 00:57:47.271561 | orchestrator | 2026-04-20 00:57:47 | INFO  | Task 6fb1a72a-3ae9-4a52-a0d7-88749d9cdc04 is in state STARTED 2026-04-20 00:57:47.271599 | orchestrator | 2026-04-20 00:57:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:57:50.308317 | orchestrator | 2026-04-20 00:57:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 00:57:50.308376 | orchestrator | 2026-04-20 00:57:50 | INFO  | Task 6fb1a72a-3ae9-4a52-a0d7-88749d9cdc04 is in state STARTED 2026-04-20 00:57:50.308382 | orchestrator | 2026-04-20 00:57:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:57:53.343924 | orchestrator | 2026-04-20 00:57:53 | INFO  | Task 
9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 00:57:53.345480 | orchestrator | 2026-04-20 00:57:53 | INFO  | Task 6fb1a72a-3ae9-4a52-a0d7-88749d9cdc04 is in state STARTED 2026-04-20 00:57:53.345525 | orchestrator | 2026-04-20 00:57:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:59:42.903610 | orchestrator | 2026-04-20 00:59:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 00:59:42.904864 | orchestrator | 2026-04-20 00:59:42 | INFO 
| Task 6fb1a72a-3ae9-4a52-a0d7-88749d9cdc04 is in state STARTED 2026-04-20 00:59:42.904892 | orchestrator | 2026-04-20 00:59:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 00:59:45.951200 | orchestrator | 2026-04-20 00:59:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 00:59:45.952913 | orchestrator | 2026-04-20 00:59:45 | INFO  | Task 6fb1a72a-3ae9-4a52-a0d7-88749d9cdc04 is in state SUCCESS 2026-04-20 00:59:45.955214 | orchestrator | 2026-04-20 00:59:45.955257 | orchestrator | 2026-04-20 00:59:45.955267 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-04-20 00:59:45.955275 | orchestrator | 2026-04-20 00:59:45.955282 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-04-20 00:59:45.955290 | orchestrator | Monday 20 April 2026 00:56:44 +0000 (0:00:00.278) 0:00:00.278 ********** 2026-04-20 00:59:45.955297 | orchestrator | ok: [testbed-node-0] 2026-04-20 00:59:45.955305 | orchestrator | ok: [testbed-node-1] 2026-04-20 00:59:45.955312 | orchestrator | ok: [testbed-node-2] 2026-04-20 00:59:45.955319 | orchestrator | 2026-04-20 00:59:45.955334 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-04-20 00:59:45.955341 | orchestrator | Monday 20 April 2026 00:56:45 +0000 (0:00:00.260) 0:00:00.539 ********** 2026-04-20 00:59:45.955348 | orchestrator | ok: [testbed-node-0] => (item=enable_magnum_True) 2026-04-20 00:59:45.955354 | orchestrator | ok: [testbed-node-1] => (item=enable_magnum_True) 2026-04-20 00:59:45.955360 | orchestrator | ok: [testbed-node-2] => (item=enable_magnum_True) 2026-04-20 00:59:45.955366 | orchestrator | 2026-04-20 00:59:45.955372 | orchestrator | PLAY [Apply role magnum] ******************************************************* 2026-04-20 00:59:45.955391 | orchestrator | 2026-04-20 00:59:45.955398 | orchestrator | TASK [magnum : include_tasks] 
************************************************** 2026-04-20 00:59:45.955456 | orchestrator | Monday 20 April 2026 00:56:45 +0000 (0:00:00.315) 0:00:00.855 ********** 2026-04-20 00:59:45.955462 | orchestrator | included: /ansible/roles/magnum/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 00:59:45.955470 | orchestrator | 2026-04-20 00:59:45.955475 | orchestrator | TASK [service-ks-register : magnum | Creating/deleting services] *************** 2026-04-20 00:59:45.955482 | orchestrator | Monday 20 April 2026 00:56:46 +0000 (0:00:00.591) 0:00:01.447 ********** 2026-04-20 00:59:45.955488 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating/deleting services (5 retries left). 2026-04-20 00:59:45.955494 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating/deleting services (4 retries left). 2026-04-20 00:59:45.955499 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating/deleting services (3 retries left). 2026-04-20 00:59:45.955505 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating/deleting services (2 retries left). 2026-04-20 00:59:45.955510 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating/deleting services (1 retries left). 
2026-04-20 00:59:45.955518 | orchestrator | failed: [testbed-node-0] (item=magnum (container-infra)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Container Infrastructure Management Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9511/v1"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9511/v1"}], "name": "magnum", "type": "container-infra"}, "msg": "kolla_toolbox container is missing or not running!"} 2026-04-20 00:59:45.955526 | orchestrator | 2026-04-20 00:59:45.955531 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 00:59:45.955537 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0 2026-04-20 00:59:45.955543 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-20 00:59:45.955550 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-20 00:59:45.955556 | orchestrator | 2026-04-20 00:59:45.955562 | orchestrator | 2026-04-20 00:59:45.955567 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 00:59:45.955573 | orchestrator | Monday 20 April 2026 00:57:39 +0000 (0:00:53.480) 0:00:54.927 ********** 2026-04-20 00:59:45.955579 | orchestrator | =============================================================================== 2026-04-20 00:59:45.955585 | orchestrator | service-ks-register : magnum | Creating/deleting services -------------- 53.48s 2026-04-20 00:59:45.955590 | orchestrator | magnum : include_tasks -------------------------------------------------- 0.59s 2026-04-20 00:59:45.955595 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.32s 2026-04-20 00:59:45.955601 | orchestrator | Group hosts based on Kolla action 
--------------------------------------- 0.26s 2026-04-20 00:59:45.955607 | orchestrator | 2026-04-20 00:59:45.955613 | orchestrator | [WARNING]: Collection community.general does not support Ansible version 2026-04-20 00:59:45.955619 | orchestrator | 2.16.14 2026-04-20 00:59:45.955626 | orchestrator | 2026-04-20 00:59:45.955632 | orchestrator | PLAY [Create ceph pools] ******************************************************* 2026-04-20 00:59:45.955638 | orchestrator | 2026-04-20 00:59:45.955645 | orchestrator | TASK [ceph-facts : Include facts.yml] ****************************************** 2026-04-20 00:59:45.955651 | orchestrator | Monday 20 April 2026 00:57:42 +0000 (0:00:00.604) 0:00:00.604 ********** 2026-04-20 00:59:45.955658 | orchestrator | included: /ansible/roles/ceph-facts/tasks/facts.yml for testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 00:59:45.955671 | orchestrator | 2026-04-20 00:59:45.955678 | orchestrator | TASK [ceph-facts : Check if it is atomic host] ********************************* 2026-04-20 00:59:45.955684 | orchestrator | Monday 20 April 2026 00:57:43 +0000 (0:00:00.658) 0:00:01.263 ********** 2026-04-20 00:59:45.955690 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:59:45.955697 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:59:45.955703 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:59:45.955709 | orchestrator | 2026-04-20 00:59:45.955715 | orchestrator | TASK [ceph-facts : Set_fact is_atomic] ***************************************** 2026-04-20 00:59:45.955732 | orchestrator | Monday 20 April 2026 00:57:44 +0000 (0:00:01.106) 0:00:02.370 ********** 2026-04-20 00:59:45.955739 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:59:45.955745 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:59:45.955752 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:59:45.955758 | orchestrator | 2026-04-20 00:59:45.955765 | orchestrator | TASK [ceph-facts : Check if podman binary is present] ************************** 
2026-04-20 00:59:45.955771 | orchestrator | Monday 20 April 2026 00:57:44 +0000 (0:00:00.255) 0:00:02.625 ********** 2026-04-20 00:59:45.955778 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:59:45.955788 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:59:45.955795 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:59:45.955802 | orchestrator | 2026-04-20 00:59:45.955808 | orchestrator | TASK [ceph-facts : Set_fact container_binary] ********************************** 2026-04-20 00:59:45.955815 | orchestrator | Monday 20 April 2026 00:57:45 +0000 (0:00:00.709) 0:00:03.334 ********** 2026-04-20 00:59:45.955822 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:59:45.955829 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:59:45.955836 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:59:45.955843 | orchestrator | 2026-04-20 00:59:45.955851 | orchestrator | TASK [ceph-facts : Set_fact ceph_cmd] ****************************************** 2026-04-20 00:59:45.956127 | orchestrator | Monday 20 April 2026 00:57:45 +0000 (0:00:00.292) 0:00:03.627 ********** 2026-04-20 00:59:45.956141 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:59:45.956148 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:59:45.956159 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:59:45.956167 | orchestrator | 2026-04-20 00:59:45.956174 | orchestrator | TASK [ceph-facts : Set_fact discovered_interpreter_python] ********************* 2026-04-20 00:59:45.956182 | orchestrator | Monday 20 April 2026 00:57:45 +0000 (0:00:00.246) 0:00:03.874 ********** 2026-04-20 00:59:45.956191 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:59:45.956198 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:59:45.956205 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:59:45.956211 | orchestrator | 2026-04-20 00:59:45.956218 | orchestrator | TASK [ceph-facts : Set_fact discovered_interpreter_python if not previously set] *** 2026-04-20 00:59:45.956225 | orchestrator | Monday 20 April 2026 
00:57:46 +0000 (0:00:00.271) 0:00:04.145 ********** 2026-04-20 00:59:45.956232 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:59:45.956240 | orchestrator | skipping: [testbed-node-4] 2026-04-20 00:59:45.956246 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:59:45.956253 | orchestrator | 2026-04-20 00:59:45.956260 | orchestrator | TASK [ceph-facts : Set_fact ceph_release ceph_stable_release] ****************** 2026-04-20 00:59:45.956267 | orchestrator | Monday 20 April 2026 00:57:46 +0000 (0:00:00.372) 0:00:04.518 ********** 2026-04-20 00:59:45.956274 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:59:45.956281 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:59:45.956287 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:59:45.956294 | orchestrator | 2026-04-20 00:59:45.956302 | orchestrator | TASK [ceph-facts : Set_fact monitor_name ansible_facts['hostname']] ************ 2026-04-20 00:59:45.956308 | orchestrator | Monday 20 April 2026 00:57:46 +0000 (0:00:00.264) 0:00:04.782 ********** 2026-04-20 00:59:45.956315 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2026-04-20 00:59:45.956323 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2026-04-20 00:59:45.956330 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2026-04-20 00:59:45.956345 | orchestrator | 2026-04-20 00:59:45.956352 | orchestrator | TASK [ceph-facts : Set_fact container_exec_cmd] ******************************** 2026-04-20 00:59:45.956359 | orchestrator | Monday 20 April 2026 00:57:47 +0000 (0:00:00.567) 0:00:05.349 ********** 2026-04-20 00:59:45.956367 | orchestrator | ok: [testbed-node-3] 2026-04-20 00:59:45.956374 | orchestrator | ok: [testbed-node-4] 2026-04-20 00:59:45.956381 | orchestrator | ok: [testbed-node-5] 2026-04-20 00:59:45.956388 | orchestrator | 2026-04-20 00:59:45.956395 | orchestrator | TASK [ceph-facts : Find a 
running mon container] ******************************* 2026-04-20 00:59:45.956416 | orchestrator | Monday 20 April 2026 00:57:47 +0000 (0:00:00.350) 0:00:05.700 ********** 2026-04-20 00:59:45.956422 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2026-04-20 00:59:45.956428 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2026-04-20 00:59:45.956435 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2026-04-20 00:59:45.956441 | orchestrator | 2026-04-20 00:59:45.956448 | orchestrator | TASK [ceph-facts : Check for a ceph mon socket] ******************************** 2026-04-20 00:59:45.956454 | orchestrator | Monday 20 April 2026 00:57:50 +0000 (0:00:02.657) 0:00:08.358 ********** 2026-04-20 00:59:45.956461 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)  2026-04-20 00:59:45.956468 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)  2026-04-20 00:59:45.956474 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)  2026-04-20 00:59:45.956481 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:59:45.956487 | orchestrator | 2026-04-20 00:59:45.956493 | orchestrator | TASK [ceph-facts : Check if the ceph mon socket is in-use] ********************* 2026-04-20 00:59:45.956500 | orchestrator | Monday 20 April 2026 00:57:50 +0000 (0:00:00.354) 0:00:08.712 ********** 2026-04-20 00:59:45.956508 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.956517 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 
'ansible_loop_var': 'item'})
2026-04-20 00:59:45.956530 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})
2026-04-20 00:59:45.956537 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.956544 | orchestrator |
2026-04-20 00:59:45.956551 | orchestrator | TASK [ceph-facts : Set_fact running_mon - non_container] ***********************
2026-04-20 00:59:45.956558 | orchestrator | Monday 20 April 2026 00:57:51 +0000 (0:00:00.653) 0:00:09.365 **********
2026-04-20 00:59:45.956571 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
2026-04-20 00:59:45.956579 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
2026-04-20 00:59:45.956586 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})
2026-04-20 00:59:45.956600 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.956606 | orchestrator |
2026-04-20 00:59:45.956613 | orchestrator | TASK [ceph-facts : Set_fact running_mon - container] ***************************
2026-04-20 00:59:45.956620 | orchestrator | Monday 20 April 2026 00:57:51 +0000 (0:00:00.131) 0:00:09.497 **********
2026-04-20 00:59:45.956627 | orchestrator | ok: [testbed-node-3] => (item={'changed': False, 'stdout': '3dd67966b57d', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-0'], 'start': '2026-04-20 00:57:48.507936', 'end': '2026-04-20 00:57:48.531531', 'delta': '0:00:00.023595', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-0', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': ['3dd67966b57d'], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-0', 'ansible_loop_var': 'item'})
2026-04-20 00:59:45.956637 | orchestrator | ok: [testbed-node-3] => (item={'changed': False, 'stdout': '9144ced3bbb3', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-1'], 'start': '2026-04-20 00:57:49.403298', 'end': '2026-04-20 00:57:49.429249', 'delta': '0:00:00.025951', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-1', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': ['9144ced3bbb3'], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-1', 'ansible_loop_var': 'item'})
2026-04-20 00:59:45.956645 | orchestrator | ok: [testbed-node-3] => (item={'changed': False, 'stdout': 'af9ec5200c8d', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-2'], 'start': '2026-04-20 00:57:50.150872', 'end': '2026-04-20 00:57:50.176349', 'delta': '0:00:00.025477', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-2', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': ['af9ec5200c8d'], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})
2026-04-20 00:59:45.956651 | orchestrator |
2026-04-20 00:59:45.956658 | orchestrator | TASK [ceph-facts : Set_fact _container_exec_cmd] *******************************
2026-04-20 00:59:45.956685 | orchestrator | Monday 20 April 2026 00:57:51 +0000 (0:00:00.288) 0:00:09.785 **********
2026-04-20 00:59:45.956692 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:59:45.956698 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:59:45.956704 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:59:45.956711 | orchestrator |
2026-04-20 00:59:45.956717 | orchestrator | TASK [ceph-facts : Get current fsid if cluster is already running] *************
2026-04-20 00:59:45.956722 | orchestrator | Monday 20 April 2026 00:57:52 +0000 (0:00:00.373) 0:00:10.159 **********
2026-04-20 00:59:45.956732 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)]
2026-04-20 00:59:45.956995 | orchestrator |
2026-04-20 00:59:45.957010 | orchestrator | TASK [ceph-facts : Set_fact current_fsid rc 1] *********************************
2026-04-20 00:59:45.957018 | orchestrator | Monday 20 April 2026 00:57:53 +0000 (0:00:01.506) 0:00:11.665 **********
2026-04-20 00:59:45.957024 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.957031 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.957045 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:59:45.957052 | orchestrator |
2026-04-20 00:59:45.957059 | orchestrator | TASK [ceph-facts : Get current fsid] *******************************************
2026-04-20 00:59:45.957066 | orchestrator | Monday 20 April 2026 00:57:53 +0000 (0:00:00.245) 0:00:11.910 **********
2026-04-20 00:59:45.957073 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.957080 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.957087 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:59:45.957093 | orchestrator |
2026-04-20 00:59:45.957101 | orchestrator | TASK [ceph-facts : Set_fact fsid] **********************************************
2026-04-20 00:59:45.957108 | orchestrator | Monday 20 April 2026 00:57:54 +0000 (0:00:00.357) 0:00:12.267 **********
2026-04-20 00:59:45.957115 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.957122 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.957129 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:59:45.957136 | orchestrator |
2026-04-20 00:59:45.957144 | orchestrator | TASK [ceph-facts : Set_fact fsid from current_fsid] ****************************
2026-04-20 00:59:45.957151 | orchestrator | Monday 20 April 2026 00:57:54 +0000 (0:00:00.354) 0:00:12.622 **********
2026-04-20 00:59:45.957158 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:59:45.957166 | orchestrator |
2026-04-20 00:59:45.957173 | orchestrator | TASK [ceph-facts : Generate cluster fsid] **************************************
2026-04-20 00:59:45.957180 | orchestrator | Monday 20 April 2026 00:57:54 +0000 (0:00:00.121) 0:00:12.744 **********
2026-04-20 00:59:45.957186 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.957193 | orchestrator |
2026-04-20 00:59:45.957200 | orchestrator | TASK [ceph-facts : Set_fact fsid] **********************************************
2026-04-20 00:59:45.957208 | orchestrator | Monday 20 April 2026 00:57:54 +0000 (0:00:00.192) 0:00:12.936 **********
2026-04-20 00:59:45.957215 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.957223 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.957230 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:59:45.957245 | orchestrator |
2026-04-20 00:59:45.957252 | orchestrator | TASK [ceph-facts : Resolve device link(s)] *************************************
2026-04-20 00:59:45.957259 | orchestrator | Monday 20 April 2026 00:57:55 +0000 (0:00:00.242) 0:00:13.178 **********
2026-04-20 00:59:45.957265 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.957272 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.957278 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:59:45.957298 | orchestrator |
2026-04-20 00:59:45.957305 | orchestrator | TASK [ceph-facts : Set_fact build devices from resolved symlinks] **************
2026-04-20 00:59:45.957311 | orchestrator | Monday 20 April 2026 00:57:55 +0000 (0:00:00.272) 0:00:13.451 **********
2026-04-20 00:59:45.957318 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.957324 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.957330 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:59:45.957336 | orchestrator |
2026-04-20 00:59:45.957343 | orchestrator | TASK [ceph-facts : Resolve dedicated_device link(s)] ***************************
2026-04-20 00:59:45.957349 | orchestrator | Monday 20 April 2026 00:57:55 +0000 (0:00:00.389) 0:00:13.841 **********
2026-04-20 00:59:45.957356 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.957363 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.957369 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:59:45.957376 | orchestrator |
2026-04-20 00:59:45.957382 | orchestrator | TASK [ceph-facts : Set_fact build dedicated_devices from resolved symlinks] ****
2026-04-20 00:59:45.957389 | orchestrator | Monday 20 April 2026 00:57:56 +0000 (0:00:00.272) 0:00:14.119 **********
2026-04-20 00:59:45.957395 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.957440 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.957448 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:59:45.957454 | orchestrator |
2026-04-20 00:59:45.957461 | orchestrator | TASK [ceph-facts : Resolve bluestore_wal_device link(s)] ***********************
2026-04-20 00:59:45.957468 | orchestrator | Monday 20 April 2026 00:57:56 +0000 (0:00:00.272) 0:00:14.392 **********
2026-04-20 00:59:45.957481 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.957487 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.957493 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:59:45.957499 | orchestrator |
2026-04-20 00:59:45.957506 | orchestrator | TASK [ceph-facts : Set_fact build bluestore_wal_devices from resolved symlinks] ***
2026-04-20 00:59:45.957513 | orchestrator | Monday 20 April 2026 00:57:56 +0000 (0:00:00.266) 0:00:14.658 **********
2026-04-20 00:59:45.957519 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.957526 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.957533 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:59:45.957540 | orchestrator |
2026-04-20 00:59:45.957546 | orchestrator | TASK [ceph-facts : Collect existed devices] ************************************
2026-04-20 00:59:45.957553 | orchestrator | Monday 20 April 2026 00:57:56 +0000 (0:00:00.402) 0:00:15.061 **********
2026-04-20 00:59:45.957593 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--4264b90b--a777--529d--80cd--078215cd7b61-osd--block--4264b90b--a777--529d--80cd--078215cd7b61', 'dm-uuid-LVM-IfSfGszKHiaKTTNI02Uf8MUQQ4OjiUWfiT4sKZQwvBWbStfgT02J1cBvwS0hJ5Ri'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957608 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--0c7195b4--6e55--5dce--81dc--250aafa1626c-osd--block--0c7195b4--6e55--5dce--81dc--250aafa1626c', 'dm-uuid-LVM-wqhPAmbCBa1EDLvJgbIeCVRnb7e8vUllW3dSX5UPeygNS001DcPoOZ2IxMimNea1'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957616 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957624 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957632 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957639 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957646 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957659 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957666 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957693 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957705 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part1', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part14', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part15', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part16', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})
2026-04-20 00:59:45.957713 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'holders': ['ceph--4264b90b--a777--529d--80cd--078215cd7b61-osd--block--4264b90b--a777--529d--80cd--078215cd7b61'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-zrnUPj-E0xj-u6GZ-IZ7t-BSHz-exTY-3U5YEc', 'scsi-0QEMU_QEMU_HARDDISK_71e5e2fe-8079-44a9-83c9-718c1a37ec11', 'scsi-SQEMU_QEMU_HARDDISK_71e5e2fe-8079-44a9-83c9-718c1a37ec11'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})
2026-04-20 00:59:45.957726 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'holders': ['ceph--0c7195b4--6e55--5dce--81dc--250aafa1626c-osd--block--0c7195b4--6e55--5dce--81dc--250aafa1626c'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-zv4eBP-1KQu-zoc8-4Ks7-3EPc-TEoE-YDwG49', 'scsi-0QEMU_QEMU_HARDDISK_0c844390-ddcc-47db-87c2-e0ad3f299f11', 'scsi-SQEMU_QEMU_HARDDISK_0c844390-ddcc-47db-87c2-e0ad3f299f11'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})
2026-04-20 00:59:45.957755 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--7b8b741f--ff85--57a0--9457--c04aa474e6a9-osd--block--7b8b741f--ff85--57a0--9457--c04aa474e6a9', 'dm-uuid-LVM-lw9HUO9cNePWfT2Pexxh0s2cnlz7QYqvJilk0DjsdoTcFqGmlXOynjtyLIMbLNxQ'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957769 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4d9b431e-9b52-486b-bddb-3e9e0ee5fa39', 'scsi-SQEMU_QEMU_HARDDISK_4d9b431e-9b52-486b-bddb-3e9e0ee5fa39'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})
2026-04-20 00:59:45.957777 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--a3c07e85--95b7--5759--bf4d--00aad97d3561-osd--block--a3c07e85--95b7--5759--bf4d--00aad97d3561', 'dm-uuid-LVM-t2zFV9phmXPkHxXhobyelLxvV4hrZYVXxR4ps4MaDQZgEsoKDmVcziC3DbY6S7qJ'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957784 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-20-00-03-47-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})
2026-04-20 00:59:45.957791 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957803 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.957810 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957817 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957823 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957848 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957858 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--f2b53557--bc93--5e7c--9922--524bc90e2f58-osd--block--f2b53557--bc93--5e7c--9922--524bc90e2f58', 'dm-uuid-LVM-oiEOoD5dVCLksAjcYkcQxq07ayCngSR6v2bKe1AtEK1XlhE9dhfgguA3x9voHqOX'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957865 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957872 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--575cdf11--a3b3--50b3--a6b0--c04d40287ec6-osd--block--575cdf11--a3b3--50b3--a6b0--c04d40287ec6', 'dm-uuid-LVM-X9Z1iQEwwD1G0QlSKwYXR1ueod4K8eqpghyiucE34SecLfgjufAMbWW75vBvaWlf'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957878 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957889 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957896 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957903 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957909 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957924 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part1', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part14', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part15', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part16', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})
2026-04-20 00:59:45.957936 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957943 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'holders': ['ceph--7b8b741f--ff85--57a0--9457--c04aa474e6a9-osd--block--7b8b741f--ff85--57a0--9457--c04aa474e6a9'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-JsgzGM-E3nq-gCWf-7fT4-ajsm-VhbE-E5ih4T', 'scsi-0QEMU_QEMU_HARDDISK_6f84c887-ba73-482f-a41f-d5b1a59c2e3c', 'scsi-SQEMU_QEMU_HARDDISK_6f84c887-ba73-482f-a41f-d5b1a59c2e3c'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})
2026-04-20 00:59:45.957949 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957961 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'holders': ['ceph--a3c07e85--95b7--5759--bf4d--00aad97d3561-osd--block--a3c07e85--95b7--5759--bf4d--00aad97d3561'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-PxO9K9-NxBp-CF9P-CZ2U-Mr2F-2HvG-ZOs2Yf', 'scsi-0QEMU_QEMU_HARDDISK_9b7f1cab-7403-4991-80fd-9e18e6faf85e', 'scsi-SQEMU_QEMU_HARDDISK_9b7f1cab-7403-4991-80fd-9e18e6faf85e'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})
2026-04-20 00:59:45.957970 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957978 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0604a395-fc8c-4060-a9f6-9fb568501435', 'scsi-SQEMU_QEMU_HARDDISK_0604a395-fc8c-4060-a9f6-9fb568501435'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})
2026-04-20 00:59:45.957985 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.957996 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-20-00-03-26-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})
2026-04-20 00:59:45.958003 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.958010 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})
2026-04-20 00:59:45.958056 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part1', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part14', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part15', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part15'], 'labels': ['UEFI'], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part16', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0',
'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:59:45.958066 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'holders': ['ceph--f2b53557--bc93--5e7c--9922--524bc90e2f58-osd--block--f2b53557--bc93--5e7c--9922--524bc90e2f58'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-yvipd2-ylGY-cevr-TOS1-fWSQ-K3IX-2V7x97', 'scsi-0QEMU_QEMU_HARDDISK_bdcbd50e-fc40-4173-bc88-351fd741a560', 'scsi-SQEMU_QEMU_HARDDISK_bdcbd50e-fc40-4173-bc88-351fd741a560'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:59:45.958078 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'holders': ['ceph--575cdf11--a3b3--50b3--a6b0--c04d40287ec6-osd--block--575cdf11--a3b3--50b3--a6b0--c04d40287ec6'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-J8WRl9-vfy9-xFuV-yNo1-3fdp-WX3V-1XW9PF', 'scsi-0QEMU_QEMU_HARDDISK_bb585aa1-11e8-43ef-a761-9431875b84d1', 'scsi-SQEMU_QEMU_HARDDISK_bb585aa1-11e8-43ef-a761-9431875b84d1'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:59:45.958086 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_6895d0f2-ba69-41e1-a4cc-d0f527389fe4', 'scsi-SQEMU_QEMU_HARDDISK_6895d0f2-ba69-41e1-a4cc-d0f527389fe4'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:59:45.958094 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-20-00-03-37-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2026-04-20 00:59:45.958101 | orchestrator | skipping: [testbed-node-5] 2026-04-20 00:59:45.958109 | orchestrator | 2026-04-20 00:59:45.958116 | orchestrator | TASK [ceph-facts : Set_fact devices 
generate device list when osd_auto_discovery] *** 2026-04-20 00:59:45.958124 | orchestrator | Monday 20 April 2026 00:57:57 +0000 (0:00:00.448) 0:00:15.509 ********** 2026-04-20 00:59:45.958138 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--4264b90b--a777--529d--80cd--078215cd7b61-osd--block--4264b90b--a777--529d--80cd--078215cd7b61', 'dm-uuid-LVM-IfSfGszKHiaKTTNI02Uf8MUQQ4OjiUWfiT4sKZQwvBWbStfgT02J1cBvwS0hJ5Ri'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958147 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--0c7195b4--6e55--5dce--81dc--250aafa1626c-osd--block--0c7195b4--6e55--5dce--81dc--250aafa1626c', 'dm-uuid-LVM-wqhPAmbCBa1EDLvJgbIeCVRnb7e8vUllW3dSX5UPeygNS001DcPoOZ2IxMimNea1'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958155 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 
'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958167 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958175 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958183 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': 
{'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958195 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958205 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958213 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 
'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958225 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--7b8b741f--ff85--57a0--9457--c04aa474e6a9-osd--block--7b8b741f--ff85--57a0--9457--c04aa474e6a9', 'dm-uuid-LVM-lw9HUO9cNePWfT2Pexxh0s2cnlz7QYqvJilk0DjsdoTcFqGmlXOynjtyLIMbLNxQ'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958232 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958240 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 
'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--a3c07e85--95b7--5759--bf4d--00aad97d3561-osd--block--a3c07e85--95b7--5759--bf4d--00aad97d3561', 'dm-uuid-LVM-t2zFV9phmXPkHxXhobyelLxvV4hrZYVXxR4ps4MaDQZgEsoKDmVcziC3DbY6S7qJ'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958256 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part1', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part14', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': 
['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part15', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part16', 'scsi-SQEMU_QEMU_HARDDISK_8a9991d5-8e83-4951-b0c2-d6541434356e-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958271 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958280 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--4264b90b--a777--529d--80cd--078215cd7b61-osd--block--4264b90b--a777--529d--80cd--078215cd7b61'], 'host': 
'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-zrnUPj-E0xj-u6GZ-IZ7t-BSHz-exTY-3U5YEc', 'scsi-0QEMU_QEMU_HARDDISK_71e5e2fe-8079-44a9-83c9-718c1a37ec11', 'scsi-SQEMU_QEMU_HARDDISK_71e5e2fe-8079-44a9-83c9-718c1a37ec11'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958291 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--0c7195b4--6e55--5dce--81dc--250aafa1626c-osd--block--0c7195b4--6e55--5dce--81dc--250aafa1626c'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-zv4eBP-1KQu-zoc8-4Ks7-3EPc-TEoE-YDwG49', 'scsi-0QEMU_QEMU_HARDDISK_0c844390-ddcc-47db-87c2-e0ad3f299f11', 'scsi-SQEMU_QEMU_HARDDISK_0c844390-ddcc-47db-87c2-e0ad3f299f11'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958302 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958313 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_4d9b431e-9b52-486b-bddb-3e9e0ee5fa39', 'scsi-SQEMU_QEMU_HARDDISK_4d9b431e-9b52-486b-bddb-3e9e0ee5fa39'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958320 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958328 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-20-00-03-47-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958335 | orchestrator | skipping: 
[testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958342 | orchestrator | skipping: [testbed-node-3] 2026-04-20 00:59:45.958352 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958363 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958375 | orchestrator | skipping: 
[testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958383 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958391 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--f2b53557--bc93--5e7c--9922--524bc90e2f58-osd--block--f2b53557--bc93--5e7c--9922--524bc90e2f58', 'dm-uuid-LVM-oiEOoD5dVCLksAjcYkcQxq07ayCngSR6v2bKe1AtEK1XlhE9dhfgguA3x9voHqOX'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 
'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958420 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part1', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part14', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part15', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part16', 'scsi-SQEMU_QEMU_HARDDISK_8a57eed1-8d6f-4860-8edf-ab5651bf3501-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 
1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958433 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--575cdf11--a3b3--50b3--a6b0--c04d40287ec6-osd--block--575cdf11--a3b3--50b3--a6b0--c04d40287ec6', 'dm-uuid-LVM-X9Z1iQEwwD1G0QlSKwYXR1ueod4K8eqpghyiucE34SecLfgjufAMbWW75vBvaWlf'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958441 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--7b8b741f--ff85--57a0--9457--c04aa474e6a9-osd--block--7b8b741f--ff85--57a0--9457--c04aa474e6a9'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-JsgzGM-E3nq-gCWf-7fT4-ajsm-VhbE-E5ih4T', 'scsi-0QEMU_QEMU_HARDDISK_6f84c887-ba73-482f-a41f-d5b1a59c2e3c', 'scsi-SQEMU_QEMU_HARDDISK_6f84c887-ba73-482f-a41f-d5b1a59c2e3c'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958448 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958458 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--a3c07e85--95b7--5759--bf4d--00aad97d3561-osd--block--a3c07e85--95b7--5759--bf4d--00aad97d3561'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-PxO9K9-NxBp-CF9P-CZ2U-Mr2F-2HvG-ZOs2Yf', 'scsi-0QEMU_QEMU_HARDDISK_9b7f1cab-7403-4991-80fd-9e18e6faf85e', 'scsi-SQEMU_QEMU_HARDDISK_9b7f1cab-7403-4991-80fd-9e18e6faf85e'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958468 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958479 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0604a395-fc8c-4060-a9f6-9fb568501435', 'scsi-SQEMU_QEMU_HARDDISK_0604a395-fc8c-4060-a9f6-9fb568501435'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958486 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958494 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-20-00-03-26-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958501 | orchestrator | skipping: 
[testbed-node-4] 2026-04-20 00:59:45.958508 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958519 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958528 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958539 | orchestrator | skipping: 
[testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958546 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958557 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part1', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part14', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part15', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part16', 'scsi-SQEMU_QEMU_HARDDISK_febc5b66-3851-48e5-b18a-64e71ac34203-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  
2026-04-20 00:59:45.958571 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--f2b53557--bc93--5e7c--9922--524bc90e2f58-osd--block--f2b53557--bc93--5e7c--9922--524bc90e2f58'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-yvipd2-ylGY-cevr-TOS1-fWSQ-K3IX-2V7x97', 'scsi-0QEMU_QEMU_HARDDISK_bdcbd50e-fc40-4173-bc88-351fd741a560', 'scsi-SQEMU_QEMU_HARDDISK_bdcbd50e-fc40-4173-bc88-351fd741a560'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958579 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--575cdf11--a3b3--50b3--a6b0--c04d40287ec6-osd--block--575cdf11--a3b3--50b3--a6b0--c04d40287ec6'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-J8WRl9-vfy9-xFuV-yNo1-3fdp-WX3V-1XW9PF', 'scsi-0QEMU_QEMU_HARDDISK_bb585aa1-11e8-43ef-a761-9431875b84d1', 'scsi-SQEMU_QEMU_HARDDISK_bb585aa1-11e8-43ef-a761-9431875b84d1'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958586 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_6895d0f2-ba69-41e1-a4cc-d0f527389fe4', 'scsi-SQEMU_QEMU_HARDDISK_6895d0f2-ba69-41e1-a4cc-d0f527389fe4'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2026-04-20 00:59:45.958593 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2026-04-20-00-03-37-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '1', 'sas_address': None, 
'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2026-04-20 00:59:45.958599 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:59:45.958606 | orchestrator |
2026-04-20 00:59:45.958613 | orchestrator | TASK [ceph-facts : Check if the ceph conf exists] ******************************
2026-04-20 00:59:45.958620 | orchestrator | Monday 20 April 2026 00:57:57 +0000 (0:00:00.506) 0:00:16.016 **********
2026-04-20 00:59:45.958626 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:59:45.958637 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:59:45.958644 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:59:45.958650 | orchestrator |
2026-04-20 00:59:45.958657 | orchestrator | TASK [ceph-facts : Set default osd_pool_default_crush_rule fact] ***************
2026-04-20 00:59:45.958667 | orchestrator | Monday 20 April 2026 00:57:58 +0000 (0:00:00.705) 0:00:16.722 **********
2026-04-20 00:59:45.958674 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:59:45.958680 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:59:45.958687 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:59:45.958693 | orchestrator |
2026-04-20 00:59:45.958700 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] ***************************
2026-04-20 00:59:45.958706 | orchestrator | Monday 20 April 2026 00:57:59 +0000 (0:00:00.398) 0:00:17.120 **********
2026-04-20 00:59:45.958712 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:59:45.958718 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:59:45.958728 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:59:45.958734 | orchestrator |
2026-04-20 00:59:45.958741 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] ***********************
2026-04-20 00:59:45.958747 | orchestrator | Monday 20 April 2026 00:58:00 +0000 (0:00:01.689) 0:00:18.809 **********
2026-04-20 00:59:45.958753 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.958759 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.958766 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:59:45.958772 | orchestrator |
2026-04-20 00:59:45.958779 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] ***************************
2026-04-20 00:59:45.958785 | orchestrator | Monday 20 April 2026 00:58:01 +0000 (0:00:00.266) 0:00:19.076 **********
2026-04-20 00:59:45.958791 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.958798 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.958804 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:59:45.958811 | orchestrator |
2026-04-20 00:59:45.958817 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] ***********************
2026-04-20 00:59:45.958823 | orchestrator | Monday 20 April 2026 00:58:01 +0000 (0:00:00.344) 0:00:19.421 **********
2026-04-20 00:59:45.958829 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.958836 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.958842 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:59:45.958848 | orchestrator |
2026-04-20 00:59:45.958855 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv4] *************************
2026-04-20 00:59:45.958860 | orchestrator | Monday 20 April 2026 00:58:01 +0000 (0:00:00.415) 0:00:19.836 **********
2026-04-20 00:59:45.958866 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-0)
2026-04-20 00:59:45.958872 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-0)
2026-04-20 00:59:45.958878 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-1)
2026-04-20 00:59:45.958885 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-0)
2026-04-20 00:59:45.958891 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-1)
2026-04-20 00:59:45.958897 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-2)
2026-04-20 00:59:45.958903 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-1)
2026-04-20 00:59:45.958909 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-2)
2026-04-20 00:59:45.958915 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-2)
2026-04-20 00:59:45.958921 | orchestrator |
2026-04-20 00:59:45.958928 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv6] *************************
2026-04-20 00:59:45.958934 | orchestrator | Monday 20 April 2026 00:58:02 +0000 (0:00:00.721) 0:00:20.558 **********
2026-04-20 00:59:45.958940 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)
2026-04-20 00:59:45.958947 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)
2026-04-20 00:59:45.958953 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)
2026-04-20 00:59:45.958959 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.958964 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-0)
2026-04-20 00:59:45.958970 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-1)
2026-04-20 00:59:45.958983 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-2)
2026-04-20 00:59:45.958990 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.958996 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-0)
2026-04-20 00:59:45.959003 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-1)
2026-04-20 00:59:45.959009 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-2)
2026-04-20 00:59:45.959016 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:59:45.959022 | orchestrator |
2026-04-20 00:59:45.959029 | orchestrator | TASK [ceph-facts : Import_tasks set_radosgw_address.yml] ***********************
2026-04-20 00:59:45.959036 | orchestrator | Monday 20 April 2026 00:58:02 +0000 (0:00:00.283) 0:00:20.842 **********
2026-04-20 00:59:45.959042 | orchestrator | included: /ansible/roles/ceph-facts/tasks/set_radosgw_address.yml for testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 00:59:45.959049 | orchestrator |
2026-04-20 00:59:45.959056 | orchestrator | TASK [ceph-facts : Set current radosgw_address_block, radosgw_address, radosgw_interface from node "{{ ceph_dashboard_call_item }}"] ***
2026-04-20 00:59:45.959063 | orchestrator | Monday 20 April 2026 00:58:03 +0000 (0:00:00.536) 0:00:21.378 **********
2026-04-20 00:59:45.959070 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.959077 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.959083 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:59:45.959090 | orchestrator |
2026-04-20 00:59:45.959097 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv4] ****
2026-04-20 00:59:45.959103 | orchestrator | Monday 20 April 2026 00:58:03 +0000 (0:00:00.302) 0:00:21.680 **********
2026-04-20 00:59:45.959110 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.959117 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.959123 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:59:45.959130 | orchestrator |
2026-04-20 00:59:45.959136 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv6] ****
2026-04-20 00:59:45.959144 | orchestrator | Monday 20 April 2026 00:58:03 +0000 (0:00:00.265) 0:00:21.946 **********
2026-04-20 00:59:45.959150 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.959156 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.959162 | orchestrator | skipping: [testbed-node-5]
2026-04-20 00:59:45.959168 | orchestrator |
2026-04-20 00:59:45.959174 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address] ***************
2026-04-20 00:59:45.959180 | orchestrator | Monday 20 April 2026 00:58:04 +0000 (0:00:00.266) 0:00:22.213 **********
2026-04-20 00:59:45.959191 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:59:45.959198 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:59:45.959204 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:59:45.959211 | orchestrator |
2026-04-20 00:59:45.959217 | orchestrator | TASK [ceph-facts : Set_fact _interface] ****************************************
2026-04-20 00:59:45.959223 | orchestrator | Monday 20 April 2026 00:58:04 +0000 (0:00:00.482) 0:00:22.695 **********
2026-04-20 00:59:45.959229 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2026-04-20 00:59:45.959239 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2026-04-20 00:59:45.959245 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2026-04-20 00:59:45.959252 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.959258 | orchestrator |
2026-04-20 00:59:45.959264 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv4] ******
2026-04-20 00:59:45.959270 | orchestrator | Monday 20 April 2026 00:58:04 +0000 (0:00:00.326) 0:00:23.021 **********
2026-04-20 00:59:45.959276 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2026-04-20 00:59:45.959283 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2026-04-20 00:59:45.959289 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2026-04-20 00:59:45.959295 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.959301 | orchestrator |
2026-04-20 00:59:45.959308 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv6] ******
2026-04-20 00:59:45.959321 | orchestrator | Monday 20 April 2026 00:58:05 +0000 (0:00:00.336) 0:00:23.357 **********
2026-04-20 00:59:45.959328 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2026-04-20 00:59:45.959336 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2026-04-20 00:59:45.959342 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2026-04-20 00:59:45.959349 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.959355 | orchestrator |
2026-04-20 00:59:45.959362 | orchestrator | TASK [ceph-facts : Reset rgw_instances (workaround)] ***************************
2026-04-20 00:59:45.959368 | orchestrator | Monday 20 April 2026 00:58:05 +0000 (0:00:00.325) 0:00:23.682 **********
2026-04-20 00:59:45.959374 | orchestrator | ok: [testbed-node-3]
2026-04-20 00:59:45.959381 | orchestrator | ok: [testbed-node-4]
2026-04-20 00:59:45.959388 | orchestrator | ok: [testbed-node-5]
2026-04-20 00:59:45.959394 | orchestrator |
2026-04-20 00:59:45.959422 | orchestrator | TASK [ceph-facts : Set_fact rgw_instances] *************************************
2026-04-20 00:59:45.959429 | orchestrator | Monday 20 April 2026 00:58:05 +0000 (0:00:00.289) 0:00:23.972 **********
2026-04-20 00:59:45.959436 | orchestrator | ok: [testbed-node-3] => (item=0)
2026-04-20 00:59:45.959443 | orchestrator | ok: [testbed-node-4] => (item=0)
2026-04-20 00:59:45.959449 | orchestrator | ok: [testbed-node-5] => (item=0)
2026-04-20 00:59:45.959455 | orchestrator |
2026-04-20 00:59:45.959461 | orchestrator | TASK [ceph-facts : Set_fact ceph_run_cmd] **************************************
2026-04-20 00:59:45.959467 | orchestrator | Monday 20 April 2026 00:58:06 +0000 (0:00:00.507) 0:00:24.479 **********
2026-04-20 00:59:45.959473 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
2026-04-20 00:59:45.959479 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2026-04-20 00:59:45.959485 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2026-04-20 00:59:45.959491 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-3)
2026-04-20 00:59:45.959497 | orchestrator | ok: [testbed-node-3 -> testbed-node-4(192.168.16.14)] => (item=testbed-node-4)
2026-04-20 00:59:45.959503 | orchestrator | ok: [testbed-node-3 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5)
2026-04-20 00:59:45.959509 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] => (item=testbed-manager)
2026-04-20 00:59:45.959515 | orchestrator |
2026-04-20 00:59:45.959521 | orchestrator | TASK [ceph-facts : Set_fact ceph_admin_command] ********************************
2026-04-20 00:59:45.959527 | orchestrator | Monday 20 April 2026 00:58:07 +0000 (0:00:00.853) 0:00:25.333 **********
2026-04-20 00:59:45.959534 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
2026-04-20 00:59:45.959541 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2026-04-20 00:59:45.959547 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2026-04-20 00:59:45.959553 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-3)
2026-04-20 00:59:45.959560 | orchestrator | ok: [testbed-node-3 -> testbed-node-4(192.168.16.14)] => (item=testbed-node-4)
2026-04-20 00:59:45.959567 | orchestrator | ok: [testbed-node-3 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5)
2026-04-20 00:59:45.959573 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] => (item=testbed-manager)
2026-04-20 00:59:45.959580 | orchestrator |
2026-04-20 00:59:45.959586 | orchestrator | TASK [Include tasks from the ceph-osd role] ************************************
2026-04-20 00:59:45.959592 | orchestrator | Monday 20 April 2026 00:58:08 +0000 (0:00:01.623) 0:00:26.956 **********
2026-04-20 00:59:45.959599 | orchestrator | skipping: [testbed-node-3]
2026-04-20 00:59:45.959605 | orchestrator | skipping: [testbed-node-4]
2026-04-20 00:59:45.959612 | orchestrator | included: /ansible/tasks/openstack_config.yml for testbed-node-5
2026-04-20 00:59:45.959618 | orchestrator |
2026-04-20 00:59:45.959625 | orchestrator | TASK [create openstack pool(s)] ************************************************
2026-04-20 00:59:45.959638 | orchestrator | Monday 20 April 2026 00:58:09 +0000 (0:00:00.323) 0:00:27.279 **********
2026-04-20 00:59:45.959653 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'backups', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1})
2026-04-20 00:59:45.959661 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'volumes', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1})
2026-04-20 00:59:45.959673 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'images', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1})
2026-04-20 00:59:45.959679 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'metrics', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1})
2026-04-20 00:59:45.959687 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'vms', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1})
2026-04-20 00:59:45.959693 | orchestrator |
2026-04-20 00:59:45.959700 | orchestrator | TASK [generate keys] ***********************************************************
2026-04-20 00:59:45.959706 | orchestrator | Monday 20 April 2026 00:58:53 +0000 (0:00:43.865) 0:01:11.144 **********
2026-04-20 00:59:45.959712 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:59:45.959718 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:59:45.959724 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:59:45.959730 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:59:45.959737 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:59:45.959744 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:59:45.959751 | orchestrator | changed: [testbed-node-5 -> {{ groups[mon_group_name][0] }}]
2026-04-20 00:59:45.959757 | orchestrator |
2026-04-20 00:59:45.959764 | orchestrator | TASK [get keys from monitors] **************************************************
2026-04-20 00:59:45.959771 | orchestrator | Monday 20 April 2026 00:59:16 +0000 (0:00:23.079) 0:01:34.224 **********
2026-04-20 00:59:45.959778 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:59:45.959785 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:59:45.959791 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:59:45.959798 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:59:45.959805 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:59:45.959812 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:59:45.959820 | orchestrator | ok: [testbed-node-5 -> {{ groups.get(mon_group_name)[0] }}]
2026-04-20 00:59:45.959827 | orchestrator |
2026-04-20 00:59:45.959833 | orchestrator | TASK [copy ceph key(s) if needed] **********************************************
2026-04-20 00:59:45.959840 | orchestrator | Monday 20 April 2026 00:59:27 +0000 (0:00:11.401) 0:01:45.625 **********
2026-04-20 00:59:45.959847 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:59:45.959858 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None)
2026-04-20 00:59:45.959865 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None)
2026-04-20 00:59:45.959872 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:59:45.959878 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None)
2026-04-20 00:59:45.959885 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None)
2026-04-20 00:59:45.959891 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:59:45.959898 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None)
2026-04-20 00:59:45.959905 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None)
2026-04-20 00:59:45.959911 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:59:45.959918 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None)
2026-04-20 00:59:45.959924 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None)
2026-04-20 00:59:45.959931 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:59:45.959938 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None)
2026-04-20 00:59:45.959950 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None)
2026-04-20 00:59:45.959958 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None)
2026-04-20 00:59:45.959965 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None)
2026-04-20 00:59:45.959972 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None)
2026-04-20 00:59:45.959979 | orchestrator | changed: [testbed-node-5 -> {{ item.1 }}]
2026-04-20 00:59:45.959986 | orchestrator |
2026-04-20 00:59:45.959997 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 00:59:45.960005 | orchestrator | testbed-node-3 : ok=25  changed=0 unreachable=0 failed=0 skipped=28  rescued=0 ignored=0
2026-04-20 00:59:45.960013 | orchestrator | testbed-node-4 : ok=18  changed=0 unreachable=0 failed=0 skipped=21  rescued=0 ignored=0
2026-04-20 00:59:45.960020 | orchestrator | testbed-node-5 : ok=23  changed=3  unreachable=0 failed=0 skipped=20  rescued=0 ignored=0
2026-04-20 00:59:45.960027 | orchestrator |
2026-04-20 00:59:45.960035 | orchestrator |
2026-04-20 00:59:45.960042 | orchestrator |
2026-04-20 00:59:45.960049 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 00:59:45.960056 | orchestrator | Monday 20 April 2026 00:59:43 +0000 (0:00:16.405) 0:02:02.031 **********
2026-04-20 00:59:45.960063 | orchestrator | ===============================================================================
2026-04-20 00:59:45.960069 | orchestrator | create openstack pool(s) ----------------------------------------------- 43.87s
2026-04-20 00:59:45.960075 | orchestrator | generate keys ---------------------------------------------------------- 23.08s
2026-04-20 00:59:45.960081 | orchestrator | copy ceph key(s) if needed --------------------------------------------- 16.41s
2026-04-20 00:59:45.960087 | orchestrator | get keys from monitors ------------------------------------------------- 11.40s
2026-04-20 00:59:45.960093 | orchestrator | ceph-facts : Find a running mon container ------------------------------- 2.66s
2026-04-20 00:59:45.960100 | orchestrator | ceph-facts : Read osd pool default crush rule --------------------------- 1.69s
2026-04-20 00:59:45.960107 | orchestrator | ceph-facts : Set_fact ceph_admin_command -------------------------------- 1.62s
2026-04-20 00:59:45.960114 | orchestrator | ceph-facts : Get current fsid if cluster is already running ------------- 1.51s
2026-04-20 00:59:45.960125 | orchestrator | ceph-facts : Check if it is atomic host --------------------------------- 1.11s
2026-04-20 00:59:45.960133 | orchestrator | ceph-facts : Set_fact ceph_run_cmd -------------------------------------- 0.85s
2026-04-20 00:59:45.960140 | orchestrator | ceph-facts : Set_fact _monitor_addresses - ipv4 ------------------------- 0.72s
2026-04-20 00:59:45.960147 | orchestrator | ceph-facts : Check if podman binary is present -------------------------- 0.71s
2026-04-20 00:59:45.960154 | orchestrator | ceph-facts : Check if the ceph conf exists ------------------------------ 0.71s
2026-04-20 00:59:45.960161 | orchestrator | ceph-facts : Include facts.yml ------------------------------------------ 0.66s
2026-04-20 00:59:45.960169 | orchestrator | ceph-facts : Check if the ceph mon socket is in-use --------------------- 0.65s
2026-04-20 00:59:45.960176 | orchestrator | ceph-facts : Set_fact monitor_name ansible_facts['hostname'] ------------ 0.57s
2026-04-20 00:59:45.960183 | orchestrator | ceph-facts : Import_tasks set_radosgw_address.yml ----------------------- 0.54s
2026-04-20 00:59:45.960190 | orchestrator | ceph-facts : Set_fact rgw_instances ------------------------------------- 0.51s
2026-04-20 00:59:45.960197 | orchestrator | ceph-facts : Set_fact devices generate device list when osd_auto_discovery --- 0.51s
2026-04-20 
00:59:45.960204 | orchestrator | ceph-facts : Set_fact _radosgw_address to radosgw_address --------------- 0.48s 2026-04-20 00:59:45.960212 | orchestrator | 2026-04-20 00:59:45 | INFO  | Task 3166ca87-6560-4ce1-8bfa-f0d0168f47b4 is in state STARTED 2026-04-20 00:59:45.960220 | orchestrator | 2026-04-20 00:59:45 | INFO  | Wait 1 second(s) until the next check [identical polling messages, repeated every ~3 s, elided: tasks 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e and 3166ca87-6560-4ce1-8bfa-f0d0168f47b4 remained in state STARTED until 3166ca87-6560-4ce1-8bfa-f0d0168f47b4 reached SUCCESS at 01:00:22; task 253f9744-aad2-4b78-a88f-2c64f2778386 then appeared in state STARTED and was polled alongside 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e until 01:01:17] 2026-04-20 01:01:17.249914 | orchestrator | 2026-04-20 01:01:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:01:17.250879 | orchestrator | 2026-04-20 01:01:17 | INFO  | Task 253f9744-aad2-4b78-a88f-2c64f2778386 is in state SUCCESS 2026-04-20
01:01:17.251201 | orchestrator | 2026-04-20 01:01:17.251223 | orchestrator | 2026-04-20 01:01:17.251228 | orchestrator | PLAY [Copy ceph keys to the configuration repository] ************************** 2026-04-20 01:01:17.251233 | orchestrator | 2026-04-20 01:01:17.251237 | orchestrator | TASK [Check if ceph keys exist] ************************************************ 2026-04-20 01:01:17.251242 | orchestrator | Monday 20 April 2026 00:59:47 +0000 (0:00:00.232) 0:00:00.232 ********** 2026-04-20 01:01:17.251246 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.admin.keyring) 2026-04-20 01:01:17.251252 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2026-04-20 01:01:17.251255 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2026-04-20 01:01:17.251259 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder-backup.keyring) 2026-04-20 01:01:17.251263 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2026-04-20 01:01:17.251267 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.nova.keyring) 2026-04-20 01:01:17.251312 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.glance.keyring) 2026-04-20 01:01:17.251318 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.gnocchi.keyring) 2026-04-20 01:01:17.251324 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.manila.keyring) 2026-04-20 01:01:17.251330 | orchestrator | 2026-04-20 01:01:17.251339 | orchestrator | TASK [Fetch all ceph keys] ***************************************************** 2026-04-20 01:01:17.251347 | orchestrator | Monday 20 April 2026 00:59:52 +0000 (0:00:04.883) 0:00:05.116 
********** 2026-04-20 01:01:17.251353 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.admin.keyring) 2026-04-20 01:01:17.251359 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2026-04-20 01:01:17.251387 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2026-04-20 01:01:17.251394 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder-backup.keyring) 2026-04-20 01:01:17.251401 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2026-04-20 01:01:17.251407 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.nova.keyring) 2026-04-20 01:01:17.251414 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.glance.keyring) 2026-04-20 01:01:17.251420 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.gnocchi.keyring) 2026-04-20 01:01:17.251426 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.manila.keyring) 2026-04-20 01:01:17.251432 | orchestrator | 2026-04-20 01:01:17.251439 | orchestrator | TASK [Create share directory] ************************************************** 2026-04-20 01:01:17.251445 | orchestrator | Monday 20 April 2026 00:59:56 +0000 (0:00:04.144) 0:00:09.260 ********** 2026-04-20 01:01:17.251453 | orchestrator | changed: [testbed-manager -> localhost] 2026-04-20 01:01:17.251459 | orchestrator | 2026-04-20 01:01:17.251467 | orchestrator | TASK [Write ceph keys to the share directory] ********************************** 2026-04-20 01:01:17.251472 | orchestrator | Monday 20 April 2026 00:59:57 +0000 (0:00:01.024) 0:00:10.285 ********** 2026-04-20 01:01:17.251477 | orchestrator | changed: [testbed-manager -> localhost] => 
(item=ceph.client.admin.keyring) 2026-04-20 01:01:17.251481 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.cinder.keyring) 2026-04-20 01:01:17.251485 | orchestrator | ok: [testbed-manager -> localhost] => (item=ceph.client.cinder.keyring) 2026-04-20 01:01:17.251489 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.cinder-backup.keyring) 2026-04-20 01:01:17.251493 | orchestrator | ok: [testbed-manager -> localhost] => (item=ceph.client.cinder.keyring) 2026-04-20 01:01:17.251497 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.nova.keyring) 2026-04-20 01:01:17.251501 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.glance.keyring) 2026-04-20 01:01:17.251504 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.gnocchi.keyring) 2026-04-20 01:01:17.251508 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.manila.keyring) 2026-04-20 01:01:17.251512 | orchestrator | 2026-04-20 01:01:17.251516 | orchestrator | TASK [Check if target directories exist] *************************************** 2026-04-20 01:01:17.251519 | orchestrator | Monday 20 April 2026 01:00:11 +0000 (0:00:13.298) 0:00:23.584 ********** 2026-04-20 01:01:17.251523 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/infrastructure/files/ceph) 2026-04-20 01:01:17.251539 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/cinder/cinder-volume) 2026-04-20 01:01:17.251547 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/cinder/cinder-backup) 2026-04-20 01:01:17.251555 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/cinder/cinder-backup) 2026-04-20 01:01:17.251573 | orchestrator | ok: [testbed-manager] => 
(item=/opt/configuration/environments/kolla/files/overlays/nova) 2026-04-20 01:01:17.251580 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/nova) 2026-04-20 01:01:17.251618 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/glance) 2026-04-20 01:01:17.251625 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/gnocchi) 2026-04-20 01:01:17.251631 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/manila) 2026-04-20 01:01:17.251637 | orchestrator | 2026-04-20 01:01:17.251651 | orchestrator | TASK [Write ceph keys to the configuration directory] ************************** 2026-04-20 01:01:17.251658 | orchestrator | Monday 20 April 2026 01:00:13 +0000 (0:00:02.960) 0:00:26.544 ********** 2026-04-20 01:01:17.251665 | orchestrator | changed: [testbed-manager] => (item=ceph.client.admin.keyring) 2026-04-20 01:01:17.251671 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder.keyring) 2026-04-20 01:01:17.251677 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder.keyring) 2026-04-20 01:01:17.251683 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder-backup.keyring) 2026-04-20 01:01:17.251689 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder.keyring) 2026-04-20 01:01:17.251696 | orchestrator | changed: [testbed-manager] => (item=ceph.client.nova.keyring) 2026-04-20 01:01:17.251701 | orchestrator | changed: [testbed-manager] => (item=ceph.client.glance.keyring) 2026-04-20 01:01:17.251705 | orchestrator | changed: [testbed-manager] => (item=ceph.client.gnocchi.keyring) 2026-04-20 01:01:17.251709 | orchestrator | changed: [testbed-manager] => (item=ceph.client.manila.keyring) 2026-04-20 01:01:17.251713 | orchestrator | 2026-04-20 01:01:17.251717 | orchestrator | PLAY RECAP 
********************************************************************* 2026-04-20 01:01:17.251720 | orchestrator | testbed-manager : ok=6  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2026-04-20 01:01:17.251726 | orchestrator | 2026-04-20 01:01:17.251731 | orchestrator | 2026-04-20 01:01:17.251737 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 01:01:17.251742 | orchestrator | Monday 20 April 2026 01:00:20 +0000 (0:00:06.231) 0:00:32.775 ********** 2026-04-20 01:01:17.251748 | orchestrator | =============================================================================== 2026-04-20 01:01:17.251755 | orchestrator | Write ceph keys to the share directory --------------------------------- 13.30s 2026-04-20 01:01:17.251761 | orchestrator | Write ceph keys to the configuration directory -------------------------- 6.23s 2026-04-20 01:01:17.251767 | orchestrator | Check if ceph keys exist ------------------------------------------------ 4.88s 2026-04-20 01:01:17.251774 | orchestrator | Fetch all ceph keys ----------------------------------------------------- 4.14s 2026-04-20 01:01:17.251780 | orchestrator | Check if target directories exist --------------------------------------- 2.96s 2026-04-20 01:01:17.251786 | orchestrator | Create share directory -------------------------------------------------- 1.02s 2026-04-20 01:01:17.251792 | orchestrator | 2026-04-20 01:01:17.251799 | orchestrator | 2026-04-20 01:01:17.251805 | orchestrator | PLAY [Apply role cephclient] *************************************************** 2026-04-20 01:01:17.251811 | orchestrator | 2026-04-20 01:01:17.251818 | orchestrator | TASK [osism.services.cephclient : Include container tasks] ********************* 2026-04-20 01:01:17.251822 | orchestrator | Monday 20 April 2026 01:00:23 +0000 (0:00:00.260) 0:00:00.260 ********** 2026-04-20 01:01:17.251826 | orchestrator | included: 
/usr/share/ansible/collections/ansible_collections/osism/services/roles/cephclient/tasks/container.yml for testbed-manager 2026-04-20 01:01:17.251832 | orchestrator | 2026-04-20 01:01:17.251836 | orchestrator | TASK [osism.services.cephclient : Create required directories] ***************** 2026-04-20 01:01:17.251839 | orchestrator | Monday 20 April 2026 01:00:23 +0000 (0:00:00.178) 0:00:00.438 ********** 2026-04-20 01:01:17.251843 | orchestrator | changed: [testbed-manager] => (item=/opt/cephclient/configuration) 2026-04-20 01:01:17.251847 | orchestrator | changed: [testbed-manager] => (item=/opt/cephclient/data) 2026-04-20 01:01:17.251852 | orchestrator | ok: [testbed-manager] => (item=/opt/cephclient) 2026-04-20 01:01:17.251855 | orchestrator | 2026-04-20 01:01:17.251859 | orchestrator | TASK [osism.services.cephclient : Copy configuration files] ******************** 2026-04-20 01:01:17.251863 | orchestrator | Monday 20 April 2026 01:00:24 +0000 (0:00:01.355) 0:00:01.794 ********** 2026-04-20 01:01:17.251867 | orchestrator | changed: [testbed-manager] => (item={'src': 'ceph.conf.j2', 'dest': '/opt/cephclient/configuration/ceph.conf'}) 2026-04-20 01:01:17.251871 | orchestrator | 2026-04-20 01:01:17.251879 | orchestrator | TASK [osism.services.cephclient : Copy keyring file] *************************** 2026-04-20 01:01:17.251883 | orchestrator | Monday 20 April 2026 01:00:25 +0000 (0:00:01.061) 0:00:02.856 ********** 2026-04-20 01:01:17.251887 | orchestrator | changed: [testbed-manager] 2026-04-20 01:01:17.251891 | orchestrator | 2026-04-20 01:01:17.251895 | orchestrator | TASK [osism.services.cephclient : Copy docker-compose.yml file] **************** 2026-04-20 01:01:17.251898 | orchestrator | Monday 20 April 2026 01:00:26 +0000 (0:00:00.832) 0:00:03.689 ********** 2026-04-20 01:01:17.251902 | orchestrator | changed: [testbed-manager] 2026-04-20 01:01:17.251906 | orchestrator | 2026-04-20 01:01:17.251915 | orchestrator | TASK [osism.services.cephclient : Manage 
cephclient service] ******************* 2026-04-20 01:01:17.251919 | orchestrator | Monday 20 April 2026 01:00:27 +0000 (0:00:00.757) 0:00:04.447 ********** 2026-04-20 01:01:17.251923 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage cephclient service (10 retries left). 2026-04-20 01:01:17.251927 | orchestrator | ok: [testbed-manager] 2026-04-20 01:01:17.251931 | orchestrator | 2026-04-20 01:01:17.251935 | orchestrator | TASK [osism.services.cephclient : Copy wrapper scripts] ************************ 2026-04-20 01:01:17.251944 | orchestrator | Monday 20 April 2026 01:01:07 +0000 (0:00:40.182) 0:00:44.629 ********** 2026-04-20 01:01:17.251949 | orchestrator | changed: [testbed-manager] => (item=ceph) 2026-04-20 01:01:17.251953 | orchestrator | changed: [testbed-manager] => (item=ceph-authtool) 2026-04-20 01:01:17.251957 | orchestrator | changed: [testbed-manager] => (item=rados) 2026-04-20 01:01:17.251960 | orchestrator | changed: [testbed-manager] => (item=radosgw-admin) 2026-04-20 01:01:17.251964 | orchestrator | changed: [testbed-manager] => (item=rbd) 2026-04-20 01:01:17.251968 | orchestrator | 2026-04-20 01:01:17.251972 | orchestrator | TASK [osism.services.cephclient : Remove old wrapper scripts] ****************** 2026-04-20 01:01:17.251975 | orchestrator | Monday 20 April 2026 01:01:11 +0000 (0:00:03.655) 0:00:48.284 ********** 2026-04-20 01:01:17.251979 | orchestrator | ok: [testbed-manager] => (item=crushtool) 2026-04-20 01:01:17.251983 | orchestrator | 2026-04-20 01:01:17.251990 | orchestrator | TASK [osism.services.cephclient : Include package tasks] *********************** 2026-04-20 01:01:17.251995 | orchestrator | Monday 20 April 2026 01:01:11 +0000 (0:00:00.490) 0:00:48.775 ********** 2026-04-20 01:01:17.252001 | orchestrator | skipping: [testbed-manager] 2026-04-20 01:01:17.252009 | orchestrator | 2026-04-20 01:01:17.252019 | orchestrator | TASK [osism.services.cephclient : Include rook task] *************************** 2026-04-20 
01:01:17.252024 | orchestrator | Monday 20 April 2026 01:01:12 +0000 (0:00:00.119) 0:00:48.895 ********** 2026-04-20 01:01:17.252030 | orchestrator | skipping: [testbed-manager] 2026-04-20 01:01:17.252036 | orchestrator | 2026-04-20 01:01:17.252041 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Restart cephclient service] ******* 2026-04-20 01:01:17.252046 | orchestrator | Monday 20 April 2026 01:01:12 +0000 (0:00:00.276) 0:00:49.172 ********** 2026-04-20 01:01:17.252052 | orchestrator | changed: [testbed-manager] 2026-04-20 01:01:17.252058 | orchestrator | 2026-04-20 01:01:17.252064 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Ensure that all containers are up] *** 2026-04-20 01:01:17.252069 | orchestrator | Monday 20 April 2026 01:01:13 +0000 (0:00:01.253) 0:00:50.425 ********** 2026-04-20 01:01:17.252075 | orchestrator | changed: [testbed-manager] 2026-04-20 01:01:17.252081 | orchestrator | 2026-04-20 01:01:17.252087 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Wait for an healthy service] ****** 2026-04-20 01:01:17.252094 | orchestrator | Monday 20 April 2026 01:01:14 +0000 (0:00:00.638) 0:00:51.063 ********** 2026-04-20 01:01:17.252100 | orchestrator | changed: [testbed-manager] 2026-04-20 01:01:17.252106 | orchestrator | 2026-04-20 01:01:17.252111 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Copy bash completion scripts] ***** 2026-04-20 01:01:17.252118 | orchestrator | Monday 20 April 2026 01:01:14 +0000 (0:00:00.495) 0:00:51.558 ********** 2026-04-20 01:01:17.252124 | orchestrator | ok: [testbed-manager] => (item=ceph) 2026-04-20 01:01:17.252129 | orchestrator | ok: [testbed-manager] => (item=rados) 2026-04-20 01:01:17.252135 | orchestrator | ok: [testbed-manager] => (item=radosgw-admin) 2026-04-20 01:01:17.252148 | orchestrator | ok: [testbed-manager] => (item=rbd) 2026-04-20 01:01:17.252154 | orchestrator | 2026-04-20 01:01:17.252161 | orchestrator | PLAY RECAP 
********************************************************************* 2026-04-20 01:01:17.252167 | orchestrator | testbed-manager : ok=12  changed=8  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2026-04-20 01:01:17.252173 | orchestrator | 2026-04-20 01:01:17.252179 | orchestrator | 2026-04-20 01:01:17.252186 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 01:01:17.252192 | orchestrator | Monday 20 April 2026 01:01:16 +0000 (0:00:01.318) 0:00:52.877 ********** 2026-04-20 01:01:17.252198 | orchestrator | =============================================================================== 2026-04-20 01:01:17.252203 | orchestrator | osism.services.cephclient : Manage cephclient service ------------------ 40.18s 2026-04-20 01:01:17.252210 | orchestrator | osism.services.cephclient : Copy wrapper scripts ------------------------ 3.66s 2026-04-20 01:01:17.252215 | orchestrator | osism.services.cephclient : Create required directories ----------------- 1.36s 2026-04-20 01:01:17.252219 | orchestrator | osism.services.cephclient : Copy bash completion scripts ---------------- 1.32s 2026-04-20 01:01:17.252225 | orchestrator | osism.services.cephclient : Restart cephclient service ------------------ 1.25s 2026-04-20 01:01:17.252231 | orchestrator | osism.services.cephclient : Copy configuration files -------------------- 1.06s 2026-04-20 01:01:17.252237 | orchestrator | osism.services.cephclient : Copy keyring file --------------------------- 0.83s 2026-04-20 01:01:17.252242 | orchestrator | osism.services.cephclient : Copy docker-compose.yml file ---------------- 0.76s 2026-04-20 01:01:17.252248 | orchestrator | osism.services.cephclient : Ensure that all containers are up ----------- 0.64s 2026-04-20 01:01:17.252255 | orchestrator | osism.services.cephclient : Wait for an healthy service ----------------- 0.50s 2026-04-20 01:01:17.252261 | orchestrator | osism.services.cephclient : Remove old wrapper scripts 
------------------ 0.49s
2026-04-20 01:01:17.252267 | orchestrator | osism.services.cephclient : Include rook task --------------------------- 0.28s
2026-04-20 01:01:17.252274 | orchestrator | osism.services.cephclient : Include container tasks --------------------- 0.18s
2026-04-20 01:01:17.252279 | orchestrator | osism.services.cephclient : Include package tasks ----------------------- 0.12s
2026-04-20 01:01:17.252285 | orchestrator | 2026-04-20 01:01:17 | INFO  | Wait 1 second(s) until the next check
2026-04-20 01:01:20.296423 | orchestrator | 2026-04-20 01:01:20 | INFO  | Task acb3ad64-24dd-4281-bab9-e18d95e9e7b1 is in state STARTED
2026-04-20 01:01:20.296510 | orchestrator | 2026-04-20 01:01:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 01:01:20.296519 | orchestrator | 2026-04-20 01:01:20 | INFO  | Task 7430aa20-1de8-4442-9104-87b57dc9ed0d is in state STARTED
2026-04-20 01:01:20.296526 | orchestrator | 2026-04-20 01:01:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 01:01:20.296534 | orchestrator | 2026-04-20 01:01:20 | INFO  | Wait 1 second(s) until the next check
2026-04-20 01:02:24.254154 | orchestrator | 2026-04-20 01:02:24 | INFO  | Task acb3ad64-24dd-4281-bab9-e18d95e9e7b1 is in state STARTED
2026-04-20 01:02:24.254218 | orchestrator | 2026-04-20 01:02:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 01:02:24.257303 | orchestrator | 2026-04-20 01:02:24 | INFO  | Task 7430aa20-1de8-4442-9104-87b57dc9ed0d is in state SUCCESS
2026-04-20 01:02:24.258683 | orchestrator |
2026-04-20 01:02:24.258725 | orchestrator |
2026-04-20 01:02:24.258731 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2026-04-20 01:02:24.258736 | orchestrator |
2026-04-20 01:02:24.258740 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2026-04-20 01:02:24.258745 | orchestrator | Monday 20 April 2026 01:01:19 +0000 (0:00:00.291) 0:00:00.291 **********
2026-04-20 01:02:24.258749 | orchestrator | ok: [testbed-manager]
2026-04-20 01:02:24.258753 | orchestrator | ok: [testbed-node-0]
2026-04-20 01:02:24.258757 | orchestrator | ok: [testbed-node-1]
2026-04-20 01:02:24.258761 | orchestrator | ok: [testbed-node-2]
2026-04-20 01:02:24.258765 | orchestrator | ok: [testbed-node-3]
2026-04-20 01:02:24.258768 | orchestrator | ok: [testbed-node-4]
2026-04-20 01:02:24.258772 | orchestrator | ok: [testbed-node-5]
2026-04-20 01:02:24.258776 | orchestrator |
2026-04-20 01:02:24.258780 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2026-04-20 01:02:24.258784
| orchestrator | Monday 20 April 2026 01:01:19 +0000 (0:00:00.644) 0:00:00.935 **********
2026-04-20 01:02:24.258788 | orchestrator | ok: [testbed-manager] => (item=enable_prometheus_True)
2026-04-20 01:02:24.258792 | orchestrator | ok: [testbed-node-0] => (item=enable_prometheus_True)
2026-04-20 01:02:24.258796 | orchestrator | ok: [testbed-node-1] => (item=enable_prometheus_True)
2026-04-20 01:02:24.258800 | orchestrator | ok: [testbed-node-2] => (item=enable_prometheus_True)
2026-04-20 01:02:24.258803 | orchestrator | ok: [testbed-node-3] => (item=enable_prometheus_True)
2026-04-20 01:02:24.258807 | orchestrator | ok: [testbed-node-4] => (item=enable_prometheus_True)
2026-04-20 01:02:24.258811 | orchestrator | ok: [testbed-node-5] => (item=enable_prometheus_True)
2026-04-20 01:02:24.258814 | orchestrator |
2026-04-20 01:02:24.258818 | orchestrator | PLAY [Apply role prometheus] ***************************************************
2026-04-20 01:02:24.258822 | orchestrator |
2026-04-20 01:02:24.258826 | orchestrator | TASK [prometheus : include_tasks] **********************************************
2026-04-20 01:02:24.258830 | orchestrator | Monday 20 April 2026 01:01:20 +0000 (0:00:00.891) 0:00:01.827 **********
2026-04-20 01:02:24.258834 | orchestrator | included: /ansible/roles/prometheus/tasks/deploy.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2026-04-20 01:02:24.258839 | orchestrator |
2026-04-20 01:02:24.258843 | orchestrator | TASK [prometheus : Ensuring config directories exist] **************************
2026-04-20 01:02:24.258861 | orchestrator | Monday 20 April 2026 01:01:21 +0000 (0:00:01.107) 0:00:02.935 **********
2026-04-20 01:02:24.258869 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image':
'registry.osism.tech/kolla/release/2024.2/prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}}) 2026-04-20 01:02:24.258877 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.258882 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.258901 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.258906 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.258910 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.258919 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 
'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.258923 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.258927 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.258932 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.258939 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.258947 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 01:02:24.258951 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.258961 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.258968 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.258975 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', 
'/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.258982 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.259002 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.259009 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.259016 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': 
{'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.259108 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.259120 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.259127 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': 
['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.259134 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.259140 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.259148 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.259153 | orchestrator | changed: [testbed-node-2] 
=> (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.259162 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.259167 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.259171 | orchestrator | 2026-04-20 01:02:24.259175 | orchestrator | TASK [prometheus : include_tasks] ********************************************** 2026-04-20 01:02:24.259350 | orchestrator | Monday 20 April 2026 01:01:25 +0000 (0:00:03.371) 0:00:06.306 ********** 2026-04-20 01:02:24.259356 
| orchestrator | included: /ansible/roles/prometheus/tasks/copy-certs.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2026-04-20 01:02:24.259360 | orchestrator | 2026-04-20 01:02:24.259364 | orchestrator | TASK [service-cert-copy : prometheus | Copying over extra CA certificates] ***** 2026-04-20 01:02:24.259368 | orchestrator | Monday 20 April 2026 01:01:26 +0000 (0:00:01.144) 0:00:07.450 ********** 2026-04-20 01:02:24.259388 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}}) 2026-04-20 01:02:24.259399 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': 
['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.259404 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.259416 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.259425 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.259433 | orchestrator | changed: [testbed-node-4] => (item={'key': 
'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.259440 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.259446 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.259453 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': 
['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.259468 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.259480 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.259486 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.259493 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 
'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.259499 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.259505 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.259511 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 
'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.259524 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.259534 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.259541 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.259547 | orchestrator | changed: 
[testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.259553 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.259560 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': 
'9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 01:02:24.259566 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.259683 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.260001 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 
'dimensions': {}}}) 2026-04-20 01:02:24.260013 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.260018 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.260022 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.260026 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.260030 | orchestrator | 2026-04-20 01:02:24.260035 | orchestrator | TASK [service-cert-copy : prometheus | Copying over backend internal TLS certificate] *** 2026-04-20 01:02:24.260039 | orchestrator | Monday 20 April 2026 01:01:31 +0000 (0:00:04.958) 0:00:12.409 ********** 2026-04-20 01:02:24.260043 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.260067 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 
'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})  2026-04-20 01:02:24.260073 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.260078 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260082 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.260086 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260090 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260093 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260113 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': 
['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260118 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.260122 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260126 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', 
'/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260130 | orchestrator | skipping: [testbed-node-2] 2026-04-20 01:02:24.260136 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260143 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260157 | orchestrator | skipping: [testbed-node-0] 2026-04-20 01:02:24.260166 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260190 | 
orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.260198 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 01:02:24.260205 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': 
['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260211 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.260256 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260264 | orchestrator | skipping: [testbed-manager] 2026-04-20 01:02:24.260304 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260314 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260415 | orchestrator | skipping: [testbed-node-4] 2026-04-20 01:02:24.260438 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260446 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260454 | orchestrator | skipping: 
[testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260460 | orchestrator | skipping: [testbed-node-3] 2026-04-20 01:02:24.260468 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260474 | orchestrator | skipping: [testbed-node-1] 2026-04-20 01:02:24.260480 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.260492 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260500 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260506 | orchestrator | skipping: [testbed-node-5] 2026-04-20 01:02:24.260512 | orchestrator | 2026-04-20 01:02:24.260516 | orchestrator | TASK [service-cert-copy : prometheus | Copying over backend internal TLS key] *** 2026-04-20 01:02:24.260523 | orchestrator | Monday 20 April 2026 01:01:33 +0000 (0:00:01.755) 0:00:14.165 ********** 2026-04-20 01:02:24.260540 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET 
/-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})  2026-04-20 01:02:24.260545 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.260549 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.260553 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': 
['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.260560 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260565 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260594 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260599 | orchestrator | skipping: [testbed-node-0] => (item={'key': 
'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260604 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.260608 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.260612 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260619 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 01:02:24.260623 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260629 | orchestrator | skipping: [testbed-node-5] 2026-04-20 
01:02:24.260643 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260647 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260651 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260655 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260662 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260666 | orchestrator | skipping: [testbed-manager] 2026-04-20 01:02:24.260670 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260674 | orchestrator | skipping: [testbed-node-0] 2026-04-20 01:02:24.260678 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 
'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260694 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.260698 | orchestrator | skipping: [testbed-node-1] 2026-04-20 01:02:24.260702 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260706 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', 
'/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260710 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.260717 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260721 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260725 | orchestrator | skipping: [testbed-node-3] 2026-04-20 01:02:24.260729 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 
'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260744 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.260749 | orchestrator | skipping: [testbed-node-2] 2026-04-20 01:02:24.260752 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.260756 | orchestrator | skipping: [testbed-node-4] 2026-04-20 01:02:24.260760 | orchestrator | 2026-04-20 01:02:24.260764 | orchestrator | TASK [prometheus : Copying over config.json files] ***************************** 2026-04-20 01:02:24.260768 | orchestrator | Monday 20 April 2026 01:01:35 +0000 
(0:00:02.153) 0:00:16.318 ********** 2026-04-20 01:02:24.260773 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}}) 2026-04-20 01:02:24.260781 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.260785 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 
'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.260789 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.260795 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.260808 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.260812 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.260816 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.260823 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.260827 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.260831 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.260835 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.260850 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', 
'/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.260855 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.260859 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.260865 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': 
'9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 01:02:24.260870 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.260874 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.260878 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.260893 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.260898 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.260904 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.260908 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 
'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.260912 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.260916 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.260920 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.260935 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.260940 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.260946 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.260950 | orchestrator | 2026-04-20 01:02:24.260954 | orchestrator | 
TASK [prometheus : Find custom prometheus alert rules files] ******************* 2026-04-20 01:02:24.260958 | orchestrator | Monday 20 April 2026 01:01:41 +0000 (0:00:05.925) 0:00:22.244 ********** 2026-04-20 01:02:24.260962 | orchestrator | ok: [testbed-manager -> localhost] 2026-04-20 01:02:24.260966 | orchestrator | 2026-04-20 01:02:24.260970 | orchestrator | TASK [prometheus : Copying over custom prometheus alert rules files] *********** 2026-04-20 01:02:24.260974 | orchestrator | Monday 20 April 2026 01:01:42 +0000 (0:00:00.861) 0:00:23.105 ********** 2026-04-20 01:02:24.260978 | orchestrator | skipping: [testbed-manager] 2026-04-20 01:02:24.260982 | orchestrator | skipping: [testbed-node-0] 2026-04-20 01:02:24.260985 | orchestrator | skipping: [testbed-node-1] 2026-04-20 01:02:24.260989 | orchestrator | skipping: [testbed-node-2] 2026-04-20 01:02:24.260993 | orchestrator | skipping: [testbed-node-3] 2026-04-20 01:02:24.260997 | orchestrator | skipping: [testbed-node-4] 2026-04-20 01:02:24.261001 | orchestrator | skipping: [testbed-node-5] 2026-04-20 01:02:24.261005 | orchestrator | 2026-04-20 01:02:24.261009 | orchestrator | TASK [prometheus : Find prometheus common config overrides] ******************** 2026-04-20 01:02:24.261013 | orchestrator | Monday 20 April 2026 01:01:42 +0000 (0:00:00.659) 0:00:23.765 ********** 2026-04-20 01:02:24.261016 | orchestrator | ok: [testbed-manager -> localhost] 2026-04-20 01:02:24.261020 | orchestrator | 2026-04-20 01:02:24.261024 | orchestrator | TASK [prometheus : Find prometheus host config overrides] ********************** 2026-04-20 01:02:24.261028 | orchestrator | Monday 20 April 2026 01:01:43 +0000 (0:00:00.711) 0:00:24.476 ********** 2026-04-20 01:02:24.261032 | orchestrator | [WARNING]: Skipped 2026-04-20 01:02:24.261036 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-20 01:02:24.261040 | orchestrator | manager/prometheus.yml.d' path due to this access issue: 
2026-04-20 01:02:24.261044 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-20 01:02:24.261048 | orchestrator | manager/prometheus.yml.d' is not a directory 2026-04-20 01:02:24.261053 | orchestrator | ok: [testbed-manager -> localhost] 2026-04-20 01:02:24.261057 | orchestrator | [WARNING]: Skipped 2026-04-20 01:02:24.261062 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-20 01:02:24.261067 | orchestrator | node-0/prometheus.yml.d' path due to this access issue: 2026-04-20 01:02:24.261071 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-20 01:02:24.261075 | orchestrator | node-0/prometheus.yml.d' is not a directory 2026-04-20 01:02:24.261080 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-04-20 01:02:24.261085 | orchestrator | [WARNING]: Skipped 2026-04-20 01:02:24.261089 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-20 01:02:24.261094 | orchestrator | node-1/prometheus.yml.d' path due to this access issue: 2026-04-20 01:02:24.261098 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-20 01:02:24.261103 | orchestrator | node-1/prometheus.yml.d' is not a directory 2026-04-20 01:02:24.261108 | orchestrator | ok: [testbed-node-1 -> localhost] 2026-04-20 01:02:24.261112 | orchestrator | [WARNING]: Skipped 2026-04-20 01:02:24.261117 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-20 01:02:24.261123 | orchestrator | node-2/prometheus.yml.d' path due to this access issue: 2026-04-20 01:02:24.261128 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-20 01:02:24.261132 | orchestrator | node-2/prometheus.yml.d' is not a directory 2026-04-20 01:02:24.261137 | orchestrator | ok: [testbed-node-2 -> localhost] 2026-04-20 
01:02:24.261141 | orchestrator | [WARNING]: Skipped 2026-04-20 01:02:24.261146 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-20 01:02:24.261150 | orchestrator | node-3/prometheus.yml.d' path due to this access issue: 2026-04-20 01:02:24.261155 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-20 01:02:24.261161 | orchestrator | node-3/prometheus.yml.d' is not a directory 2026-04-20 01:02:24.261166 | orchestrator | ok: [testbed-node-3 -> localhost] 2026-04-20 01:02:24.261180 | orchestrator | [WARNING]: Skipped 2026-04-20 01:02:24.261185 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-20 01:02:24.261190 | orchestrator | node-5/prometheus.yml.d' path due to this access issue: 2026-04-20 01:02:24.261194 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-20 01:02:24.261198 | orchestrator | node-5/prometheus.yml.d' is not a directory 2026-04-20 01:02:24.261203 | orchestrator | ok: [testbed-node-5 -> localhost] 2026-04-20 01:02:24.261207 | orchestrator | [WARNING]: Skipped 2026-04-20 01:02:24.261212 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-20 01:02:24.261217 | orchestrator | node-4/prometheus.yml.d' path due to this access issue: 2026-04-20 01:02:24.261221 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2026-04-20 01:02:24.261226 | orchestrator | node-4/prometheus.yml.d' is not a directory 2026-04-20 01:02:24.261230 | orchestrator | ok: [testbed-node-4 -> localhost] 2026-04-20 01:02:24.261234 | orchestrator | 2026-04-20 01:02:24.261239 | orchestrator | TASK [prometheus : Copying over prometheus config file] ************************ 2026-04-20 01:02:24.261244 | orchestrator | Monday 20 April 2026 01:01:44 +0000 (0:00:01.485) 0:00:25.961 ********** 2026-04-20 01:02:24.261249 | 
orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2026-04-20 01:02:24.261253 | orchestrator | skipping: [testbed-node-0] 2026-04-20 01:02:24.261258 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2026-04-20 01:02:24.261262 | orchestrator | skipping: [testbed-node-1] 2026-04-20 01:02:24.261266 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2026-04-20 01:02:24.261271 | orchestrator | skipping: [testbed-node-4] 2026-04-20 01:02:24.261275 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2026-04-20 01:02:24.261280 | orchestrator | skipping: [testbed-node-5] 2026-04-20 01:02:24.261284 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2026-04-20 01:02:24.261288 | orchestrator | skipping: [testbed-node-2] 2026-04-20 01:02:24.261293 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2026-04-20 01:02:24.261297 | orchestrator | skipping: [testbed-node-3] 2026-04-20 01:02:24.261301 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2) 2026-04-20 01:02:24.261306 | orchestrator | 2026-04-20 01:02:24.261310 | orchestrator | TASK [prometheus : Copying over prometheus web config file] ******************** 2026-04-20 01:02:24.261315 | orchestrator | Monday 20 April 2026 01:01:56 +0000 (0:00:11.872) 0:00:37.833 ********** 2026-04-20 01:02:24.261319 | orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2026-04-20 01:02:24.261324 | orchestrator | skipping: [testbed-node-0] 2026-04-20 01:02:24.261328 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2026-04-20 
01:02:24.261335 | orchestrator | skipping: [testbed-node-1] 2026-04-20 01:02:24.261340 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2026-04-20 01:02:24.261344 | orchestrator | skipping: [testbed-node-3] 2026-04-20 01:02:24.261349 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2026-04-20 01:02:24.261353 | orchestrator | skipping: [testbed-node-2] 2026-04-20 01:02:24.261358 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2026-04-20 01:02:24.261362 | orchestrator | skipping: [testbed-node-4] 2026-04-20 01:02:24.261366 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2026-04-20 01:02:24.261370 | orchestrator | skipping: [testbed-node-5] 2026-04-20 01:02:24.261375 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2) 2026-04-20 01:02:24.261379 | orchestrator | 2026-04-20 01:02:24.261384 | orchestrator | TASK [prometheus : Copying over prometheus alertmanager config file] *********** 2026-04-20 01:02:24.261388 | orchestrator | Monday 20 April 2026 01:01:59 +0000 (0:00:03.099) 0:00:40.932 ********** 2026-04-20 01:02:24.261393 | orchestrator | skipping: [testbed-node-0] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2026-04-20 01:02:24.261398 | orchestrator | skipping: [testbed-node-0] 2026-04-20 01:02:24.261402 | orchestrator | skipping: [testbed-node-1] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2026-04-20 01:02:24.261407 | orchestrator | skipping: [testbed-node-1] 2026-04-20 01:02:24.261411 | orchestrator | skipping: [testbed-node-2] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  
2026-04-20 01:02:24.261416 | orchestrator | skipping: [testbed-node-2] 2026-04-20 01:02:24.261420 | orchestrator | skipping: [testbed-node-3] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2026-04-20 01:02:24.261425 | orchestrator | skipping: [testbed-node-3] 2026-04-20 01:02:24.261429 | orchestrator | skipping: [testbed-node-4] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2026-04-20 01:02:24.261435 | orchestrator | skipping: [testbed-node-4] 2026-04-20 01:02:24.261441 | orchestrator | changed: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml) 2026-04-20 01:02:24.261445 | orchestrator | skipping: [testbed-node-5] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2026-04-20 01:02:24.261449 | orchestrator | skipping: [testbed-node-5] 2026-04-20 01:02:24.261453 | orchestrator | 2026-04-20 01:02:24.261457 | orchestrator | TASK [prometheus : Find custom Alertmanager alert notification templates] ****** 2026-04-20 01:02:24.261461 | orchestrator | Monday 20 April 2026 01:02:01 +0000 (0:00:01.334) 0:00:42.267 ********** 2026-04-20 01:02:24.261464 | orchestrator | ok: [testbed-manager -> localhost] 2026-04-20 01:02:24.261468 | orchestrator | 2026-04-20 01:02:24.261472 | orchestrator | TASK [prometheus : Copying over custom Alertmanager alert notification templates] *** 2026-04-20 01:02:24.261476 | orchestrator | Monday 20 April 2026 01:02:01 +0000 (0:00:00.686) 0:00:42.953 ********** 2026-04-20 01:02:24.261480 | orchestrator | skipping: [testbed-manager] 2026-04-20 01:02:24.261483 | orchestrator | skipping: [testbed-node-0] 2026-04-20 01:02:24.261487 | orchestrator | skipping: [testbed-node-1] 2026-04-20 01:02:24.261491 | orchestrator | skipping: [testbed-node-2] 2026-04-20 01:02:24.261494 | orchestrator | skipping: [testbed-node-3] 
2026-04-20 01:02:24.261537 | orchestrator | skipping: [testbed-node-4] 2026-04-20 01:02:24.261542 | orchestrator | skipping: [testbed-node-5] 2026-04-20 01:02:24.261546 | orchestrator | 2026-04-20 01:02:24.261550 | orchestrator | TASK [prometheus : Copying over my.cnf for mysqld_exporter] ******************** 2026-04-20 01:02:24.261557 | orchestrator | Monday 20 April 2026 01:02:02 +0000 (0:00:00.662) 0:00:43.615 ********** 2026-04-20 01:02:24.261561 | orchestrator | skipping: [testbed-manager] 2026-04-20 01:02:24.261565 | orchestrator | skipping: [testbed-node-3] 2026-04-20 01:02:24.261569 | orchestrator | skipping: [testbed-node-4] 2026-04-20 01:02:24.261573 | orchestrator | skipping: [testbed-node-5] 2026-04-20 01:02:24.261590 | orchestrator | changed: [testbed-node-0] 2026-04-20 01:02:24.261594 | orchestrator | changed: [testbed-node-1] 2026-04-20 01:02:24.261598 | orchestrator | changed: [testbed-node-2] 2026-04-20 01:02:24.261601 | orchestrator | 2026-04-20 01:02:24.261605 | orchestrator | TASK [prometheus : Copying cloud config file for openstack exporter] *********** 2026-04-20 01:02:24.261609 | orchestrator | Monday 20 April 2026 01:02:04 +0000 (0:00:01.639) 0:00:45.255 ********** 2026-04-20 01:02:24.261613 | orchestrator | skipping: [testbed-manager] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2026-04-20 01:02:24.261617 | orchestrator | skipping: [testbed-manager] 2026-04-20 01:02:24.261620 | orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2026-04-20 01:02:24.261624 | orchestrator | skipping: [testbed-node-0] 2026-04-20 01:02:24.261628 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2026-04-20 01:02:24.261631 | orchestrator | skipping: [testbed-node-1] 2026-04-20 01:02:24.261635 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2026-04-20 01:02:24.261639 | orchestrator | 
skipping: [testbed-node-2] 2026-04-20 01:02:24.261643 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2026-04-20 01:02:24.261646 | orchestrator | skipping: [testbed-node-3] 2026-04-20 01:02:24.261650 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2026-04-20 01:02:24.261686 | orchestrator | skipping: [testbed-node-4] 2026-04-20 01:02:24.261690 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2026-04-20 01:02:24.261694 | orchestrator | skipping: [testbed-node-5] 2026-04-20 01:02:24.261698 | orchestrator | 2026-04-20 01:02:24.261702 | orchestrator | TASK [prometheus : Copying config file for blackbox exporter] ****************** 2026-04-20 01:02:24.261705 | orchestrator | Monday 20 April 2026 01:02:05 +0000 (0:00:01.103) 0:00:46.359 ********** 2026-04-20 01:02:24.261709 | orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2026-04-20 01:02:24.261713 | orchestrator | skipping: [testbed-node-0] 2026-04-20 01:02:24.261717 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2026-04-20 01:02:24.261721 | orchestrator | skipping: [testbed-node-1] 2026-04-20 01:02:24.261724 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2026-04-20 01:02:24.261728 | orchestrator | skipping: [testbed-node-2] 2026-04-20 01:02:24.261732 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2026-04-20 01:02:24.261736 | orchestrator | skipping: [testbed-node-3] 2026-04-20 01:02:24.261739 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2026-04-20 
01:02:24.261743 | orchestrator | skipping: [testbed-node-4] 2026-04-20 01:02:24.261747 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2) 2026-04-20 01:02:24.261751 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2026-04-20 01:02:24.261755 | orchestrator | skipping: [testbed-node-5] 2026-04-20 01:02:24.261758 | orchestrator | 2026-04-20 01:02:24.261762 | orchestrator | TASK [prometheus : Find extra prometheus server config files] ****************** 2026-04-20 01:02:24.261769 | orchestrator | Monday 20 April 2026 01:02:06 +0000 (0:00:01.371) 0:00:47.730 ********** 2026-04-20 01:02:24.261774 | orchestrator | [WARNING]: Skipped 2026-04-20 01:02:24.261778 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/extras/' path 2026-04-20 01:02:24.261785 | orchestrator | due to this access issue: 2026-04-20 01:02:24.261789 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/extras/' is 2026-04-20 01:02:24.261797 | orchestrator | not a directory 2026-04-20 01:02:24.261801 | orchestrator | ok: [testbed-manager -> localhost] 2026-04-20 01:02:24.261805 | orchestrator | 2026-04-20 01:02:24.261809 | orchestrator | TASK [prometheus : Create subdirectories for extra config files] *************** 2026-04-20 01:02:24.261812 | orchestrator | Monday 20 April 2026 01:02:07 +0000 (0:00:01.027) 0:00:48.758 ********** 2026-04-20 01:02:24.261816 | orchestrator | skipping: [testbed-manager] 2026-04-20 01:02:24.261820 | orchestrator | skipping: [testbed-node-0] 2026-04-20 01:02:24.261824 | orchestrator | skipping: [testbed-node-1] 2026-04-20 01:02:24.261828 | orchestrator | skipping: [testbed-node-2] 2026-04-20 01:02:24.261831 | orchestrator | skipping: [testbed-node-3] 2026-04-20 01:02:24.261835 | orchestrator | skipping: [testbed-node-4] 2026-04-20 01:02:24.261839 | orchestrator | 
skipping: [testbed-node-5] 2026-04-20 01:02:24.261843 | orchestrator | 2026-04-20 01:02:24.261846 | orchestrator | TASK [prometheus : Template extra prometheus server config files] ************** 2026-04-20 01:02:24.261850 | orchestrator | Monday 20 April 2026 01:02:08 +0000 (0:00:00.618) 0:00:49.376 ********** 2026-04-20 01:02:24.261854 | orchestrator | skipping: [testbed-manager] 2026-04-20 01:02:24.261858 | orchestrator | skipping: [testbed-node-0] 2026-04-20 01:02:24.261862 | orchestrator | skipping: [testbed-node-1] 2026-04-20 01:02:24.261866 | orchestrator | skipping: [testbed-node-2] 2026-04-20 01:02:24.261869 | orchestrator | skipping: [testbed-node-3] 2026-04-20 01:02:24.261873 | orchestrator | skipping: [testbed-node-4] 2026-04-20 01:02:24.261877 | orchestrator | skipping: [testbed-node-5] 2026-04-20 01:02:24.261881 | orchestrator | 2026-04-20 01:02:24.261885 | orchestrator | TASK [service-check-containers : prometheus | Check containers] **************** 2026-04-20 01:02:24.261888 | orchestrator | Monday 20 April 2026 01:02:09 +0000 (0:00:00.722) 0:00:50.098 ********** 2026-04-20 01:02:24.261893 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 
'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}}) 2026-04-20 01:02:24.261897 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.261902 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.261909 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.261918 | 
orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.261923 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.261927 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.261931 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 
'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2026-04-20 01:02:24.261935 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.261939 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.261945 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.261949 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 
'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.261958 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.261962 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.261966 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.261970 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.261974 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.261981 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.261985 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.261994 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 01:02:24.261999 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.262003 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.262007 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.262047 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.262053 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.262057 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2026-04-20 01:02:24.262067 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.262071 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': 
{'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.262076 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2026-04-20 01:02:24.262080 | orchestrator | 2026-04-20 01:02:24.262084 | orchestrator | TASK [service-check-containers : prometheus | Notify handlers to restart containers] *** 2026-04-20 01:02:24.262087 | orchestrator | Monday 20 April 2026 01:02:12 +0000 (0:00:03.941) 0:00:54.040 ********** 2026-04-20 01:02:24.262091 | orchestrator | changed: [testbed-manager] => { 2026-04-20 01:02:24.262095 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 01:02:24.262099 | orchestrator | } 2026-04-20 01:02:24.262103 | orchestrator | changed: [testbed-node-0] => { 2026-04-20 01:02:24.262107 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 01:02:24.262114 | orchestrator | } 2026-04-20 01:02:24.262118 | orchestrator | changed: [testbed-node-1] => { 2026-04-20 01:02:24.262121 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 01:02:24.262125 | orchestrator | } 2026-04-20 01:02:24.262129 | orchestrator | changed: [testbed-node-2] => { 2026-04-20 01:02:24.262133 | orchestrator |  
"msg": "Notifying handlers" 2026-04-20 01:02:24.262136 | orchestrator | } 2026-04-20 01:02:24.262140 | orchestrator | changed: [testbed-node-3] => { 2026-04-20 01:02:24.262144 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 01:02:24.262148 | orchestrator | } 2026-04-20 01:02:24.262151 | orchestrator | changed: [testbed-node-4] => { 2026-04-20 01:02:24.262155 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 01:02:24.262159 | orchestrator | } 2026-04-20 01:02:24.262163 | orchestrator | changed: [testbed-node-5] => { 2026-04-20 01:02:24.262166 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 01:02:24.262170 | orchestrator | } 2026-04-20 01:02:24.262174 | orchestrator | 2026-04-20 01:02:24.262178 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-20 01:02:24.262182 | orchestrator | Monday 20 April 2026 01:02:13 +0000 (0:00:00.732) 0:00:54.773 ********** 2026-04-20 01:02:24.262186 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-server:3.2.1.20260328', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_server:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True, 'backend_http_extra': ['option httpchk GET /-/ready 
HTTP/1.0', "http-check send hdr Authorization 'Basic aGFwcm94eTptdWVNaWV4aWUzYW5nb28wZnVjaGFod2VlUXVhaEpvbw=='"]}}}})  2026-04-20 01:02:24.262194 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.262199 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.262203 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-alertmanager:0.28.1.20260328', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 
'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True, 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 01:02:24.262210 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'cap_add': ['CAP_NET_RAW'], 'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-blackbox-exporter:0.25.0.20260328', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.262214 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.262218 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.262222 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.262231 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.262236 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.262240 | orchestrator | skipping: [testbed-manager] 2026-04-20 01:02:24.262244 | orchestrator | skipping: 
[testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.262250 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.262254 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.262258 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.262262 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.262271 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.262275 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-mysqld-exporter:0.16.0.20260328', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.262279 | orchestrator | skipping: [testbed-node-2] => 
(item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-memcached-exporter:0.15.0.20260328', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.262287 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.262291 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-elasticsearch-exporter:1.8.0.20260328', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2026-04-20 01:02:24.262295 | orchestrator | skipping: [testbed-node-0] 2026-04-20 01:02:24.262299 | orchestrator | skipping: [testbed-node-1] 2026-04-20 01:02:24.262303 | orchestrator | skipping: [testbed-node-2] 2026-04-20 01:02:24.262307 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 
'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.262312 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.262316 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.262319 | orchestrator | skipping: [testbed-node-3] 2026-04-20 01:02:24.262328 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.262340 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.262348 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.262357 | orchestrator | skipping: [testbed-node-4] 2026-04-20 01:02:24.262365 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-node-exporter:1.8.2.20260328', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2026-04-20 01:02:24.262372 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-cadvisor:0.49.2.20260328', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.262379 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/prometheus-libvirt-exporter:2.2.0.20260328', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2026-04-20 01:02:24.262385 | orchestrator | skipping: [testbed-node-5] 2026-04-20 01:02:24.262391 | orchestrator | 2026-04-20 01:02:24.262397 | orchestrator | TASK [prometheus : Creating prometheus database user and setting permissions] *** 2026-04-20 01:02:24.262403 | orchestrator | Monday 20 April 2026 01:02:15 +0000 (0:00:01.659) 0:00:56.433 ********** 2026-04-20 01:02:24.262409 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-0)  2026-04-20 01:02:24.262416 | orchestrator | skipping: [testbed-manager] 2026-04-20 01:02:24.262422 | orchestrator | 2026-04-20 01:02:24.262428 | orchestrator | TASK [prometheus : Flush handlers] 
********************************************* 2026-04-20 01:02:24.262435 | orchestrator | Monday 20 April 2026 01:02:16 +0000 (0:00:01.011) 0:00:57.444 ********** 2026-04-20 01:02:24.262441 | orchestrator | 2026-04-20 01:02:24.262448 | orchestrator | TASK [prometheus : Flush handlers] ********************************************* 2026-04-20 01:02:24.262454 | orchestrator | Monday 20 April 2026 01:02:16 +0000 (0:00:00.060) 0:00:57.505 ********** 2026-04-20 01:02:24.262461 | orchestrator | 2026-04-20 01:02:24.262467 | orchestrator | TASK [prometheus : Flush handlers] ********************************************* 2026-04-20 01:02:24.262474 | orchestrator | Monday 20 April 2026 01:02:16 +0000 (0:00:00.185) 0:00:57.690 ********** 2026-04-20 01:02:24.262486 | orchestrator | 2026-04-20 01:02:24.262496 | orchestrator | TASK [prometheus : Flush handlers] ********************************************* 2026-04-20 01:02:24.262506 | orchestrator | Monday 20 April 2026 01:02:16 +0000 (0:00:00.057) 0:00:57.748 ********** 2026-04-20 01:02:24.262513 | orchestrator | 2026-04-20 01:02:24.262520 | orchestrator | TASK [prometheus : Flush handlers] ********************************************* 2026-04-20 01:02:24.262527 | orchestrator | Monday 20 April 2026 01:02:16 +0000 (0:00:00.057) 0:00:57.805 ********** 2026-04-20 01:02:24.262534 | orchestrator | 2026-04-20 01:02:24.262541 | orchestrator | TASK [prometheus : Flush handlers] ********************************************* 2026-04-20 01:02:24.262548 | orchestrator | Monday 20 April 2026 01:02:16 +0000 (0:00:00.055) 0:00:57.861 ********** 2026-04-20 01:02:24.262555 | orchestrator | 2026-04-20 01:02:24.262560 | orchestrator | TASK [prometheus : Flush handlers] ********************************************* 2026-04-20 01:02:24.262565 | orchestrator | Monday 20 April 2026 01:02:16 +0000 (0:00:00.059) 0:00:57.920 ********** 2026-04-20 01:02:24.262569 | orchestrator | 2026-04-20 01:02:24.262598 | orchestrator | RUNNING HANDLER [prometheus : 
Restart prometheus-server container] ************* 2026-04-20 01:02:24.262605 | orchestrator | Monday 20 April 2026 01:02:16 +0000 (0:00:00.080) 0:00:58.001 ********** 2026-04-20 01:02:24.262610 | orchestrator | fatal: [testbed-manager]: FAILED! => {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=3.2.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fprometheus-server\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_h6opm4st/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_h6opm4st/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_h6opm4st/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_h6opm4st/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File 
\"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=3.2.1.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fprometheus-server: Internal Server Error (\"unknown: repository kolla/release/2024.2/prometheus-server not found\")\\n'"} 2026-04-20 01:02:24.262616 | orchestrator | 2026-04-20 01:02:24.262621 | orchestrator | RUNNING HANDLER [prometheus : Restart prometheus-node-exporter container] ****** 2026-04-20 01:02:24.262625 | orchestrator | Monday 20 April 2026 01:02:19 +0000 (0:00:02.360) 0:01:00.361 ********** 2026-04-20 01:02:24.262641 | orchestrator | fatal: [testbed-node-4]: FAILED! 
=> {"changed": true, "msg": "'Traceback (most recent call last):\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 275, in _raise_for_status\\n response.raise_for_status()\\n File \"/usr/lib/python3/dist-packages/requests/models.py\", line 1021, in raise_for_status\\n raise HTTPError(http_error_msg, response=self)\\nrequests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: http+docker://localhost/v1.47/images/create?tag=1.8.2.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fprometheus-node-exporter\\n\\nThe above exception was the direct cause of the following exception:\\n\\nTraceback (most recent call last):\\n File \"/tmp/ansible_kolla_container_payload_9_zv3o2u/ansible_kolla_container_payload.zip/ansible/modules/kolla_container.py\", line 421, in main\\n result = bool(getattr(cw, module.params.get(\\'action\\'))())\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/tmp/ansible_kolla_container_payload_9_zv3o2u/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 352, in recreate_or_restart_container\\n self.start_container()\\n File \"/tmp/ansible_kolla_container_payload_9_zv3o2u/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 370, in start_container\\n self.pull_image()\\n File \"/tmp/ansible_kolla_container_payload_9_zv3o2u/ansible_kolla_container_payload.zip/ansible/module_utils/kolla_docker_worker.py\", line 202, in pull_image\\n json.loads(line.strip().decode(\\'utf-8\\')) for line in self.dc.pull(\\n ^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/api/image.py\", line 429, in pull\\n self._raise_for_status(response)\\n File \"/usr/lib/python3/dist-packages/docker/api/client.py\", line 277, in _raise_for_status\\n raise create_api_error_from_http_exception(e) from e\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \"/usr/lib/python3/dist-packages/docker/errors.py\", line 39, in 
create_api_error_from_http_exception\\n raise cls(e, response=response, explanation=explanation) from e\\ndocker.errors.APIError: 500 Server Error for http+docker://localhost/v1.47/images/create?tag=1.8.2.20260328&fromImage=registry.osism.tech%2Fkolla%2Frelease%2F2024.2%2Fprometheus-node-exporter: Internal Server Error (\"unknown: repository kolla/release/2024.2/prometheus-node-exporter not found\")\\n'"} 2026-04-20 01:02:24.262653 | orchestrator | fatal: [testbed-node-0]: FAILED! => (same APIError as testbed-node-4: repository kolla/release/2024.2/prometheus-node-exporter not found) 2026-04-20 01:02:24.262663 | orchestrator | fatal: [testbed-node-2]: FAILED! => (same APIError as testbed-node-4) 2026-04-20 01:02:24.262674 | orchestrator | fatal: [testbed-node-5]: FAILED! => (same APIError as testbed-node-4) 2026-04-20 01:02:24.262684 | orchestrator | fatal: [testbed-node-1]: FAILED! => (same APIError as testbed-node-4) 2026-04-20 01:02:24.262697 | orchestrator | fatal: [testbed-node-3]: FAILED! => (same APIError as testbed-node-4) 2026-04-20 01:02:24.262702 | orchestrator | 2026-04-20 01:02:24.262706 | orchestrator | PLAY RECAP ********************************************************************* 2026-04-20 01:02:24.262711 | orchestrator | testbed-manager : ok=18  changed=9  unreachable=0 failed=1  skipped=10  rescued=0 ignored=0 2026-04-20 01:02:24.262717 | orchestrator | testbed-node-0 : ok=11  changed=6  unreachable=0 failed=1  skipped=12  rescued=0 ignored=0 2026-04-20 01:02:24.262722 | orchestrator | testbed-node-1 : ok=11  changed=6  unreachable=0 failed=1  skipped=12  
rescued=0 ignored=0 2026-04-20 01:02:24.262727 | orchestrator | testbed-node-2 : ok=11  changed=6  unreachable=0 failed=1  skipped=12  rescued=0 ignored=0 2026-04-20 01:02:24.262735 | orchestrator | testbed-node-3 : ok=10  changed=5  unreachable=0 failed=1  skipped=13  rescued=0 ignored=0 2026-04-20 01:02:24.262739 | orchestrator | testbed-node-4 : ok=10  changed=5  unreachable=0 failed=1  skipped=13  rescued=0 ignored=0 2026-04-20 01:02:24.262744 | orchestrator | testbed-node-5 : ok=10  changed=5  unreachable=0 failed=1  skipped=13  rescued=0 ignored=0 2026-04-20 01:02:24.262748 | orchestrator | 2026-04-20 01:02:24.262752 | orchestrator | 2026-04-20 01:02:24.262756 | orchestrator | TASKS RECAP ******************************************************************** 2026-04-20 01:02:24.262760 | orchestrator | Monday 20 April 2026 01:02:23 +0000 (0:00:04.243) 0:01:04.605 ********** 2026-04-20 01:02:24.262763 | orchestrator | =============================================================================== 2026-04-20 01:02:24.262767 | orchestrator | prometheus : Copying over prometheus config file ----------------------- 11.87s 2026-04-20 01:02:24.262771 | orchestrator | prometheus : Copying over config.json files ----------------------------- 5.93s 2026-04-20 01:02:24.262777 | orchestrator | service-cert-copy : prometheus | Copying over extra CA certificates ----- 4.96s 2026-04-20 01:02:24.262783 | orchestrator | prometheus : Restart prometheus-node-exporter container ----------------- 4.24s 2026-04-20 01:02:24.262787 | orchestrator | service-check-containers : prometheus | Check containers ---------------- 3.94s 2026-04-20 01:02:24.262791 | orchestrator | prometheus : Ensuring config directories exist -------------------------- 3.37s 2026-04-20 01:02:24.262794 | orchestrator | prometheus : Copying over prometheus web config file -------------------- 3.10s 2026-04-20 01:02:24.262798 | orchestrator | prometheus : Restart prometheus-server container ------------------------ 
2.36s 2026-04-20 01:02:24.262802 | orchestrator | service-cert-copy : prometheus | Copying over backend internal TLS key --- 2.15s 2026-04-20 01:02:24.262806 | orchestrator | service-cert-copy : prometheus | Copying over backend internal TLS certificate --- 1.76s 2026-04-20 01:02:24.262810 | orchestrator | service-check-containers : Include tasks -------------------------------- 1.66s 2026-04-20 01:02:24.262813 | orchestrator | prometheus : Copying over my.cnf for mysqld_exporter -------------------- 1.64s 2026-04-20 01:02:24.262817 | orchestrator | prometheus : Find prometheus host config overrides ---------------------- 1.49s 2026-04-20 01:02:24.262821 | orchestrator | prometheus : Copying config file for blackbox exporter ------------------ 1.37s 2026-04-20 01:02:24.262825 | orchestrator | prometheus : Copying over prometheus alertmanager config file ----------- 1.33s 2026-04-20 01:02:24.262828 | orchestrator | prometheus : include_tasks ---------------------------------------------- 1.14s 2026-04-20 01:02:24.262832 | orchestrator | prometheus : include_tasks ---------------------------------------------- 1.11s 2026-04-20 01:02:24.262836 | orchestrator | prometheus : Copying cloud config file for openstack exporter ----------- 1.10s 2026-04-20 01:02:24.262840 | orchestrator | prometheus : Find extra prometheus server config files ------------------ 1.03s 2026-04-20 01:02:24.262843 | orchestrator | prometheus : Creating prometheus database user and setting permissions --- 1.01s 2026-04-20 01:02:24.262847 | orchestrator | 2026-04-20 01:02:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:02:24.262851 | orchestrator | 2026-04-20 01:02:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:02:27.310537 | orchestrator | 2026-04-20 01:02:27 | INFO  | Task acb3ad64-24dd-4281-bab9-e18d95e9e7b1 is in state STARTED 2026-04-20 01:02:27.311660 | orchestrator | 2026-04-20 01:02:27 | INFO  | Task 
9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:02:27.314393 | orchestrator | 2026-04-20 01:02:27 | INFO  | Task 697d75d7-b530-4670-979f-fedd4a50a857 is in state STARTED 2026-04-20 01:02:27.318313 | orchestrator | 2026-04-20 01:02:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:02:27.318403 | orchestrator | 2026-04-20 01:02:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:02:45.631944 | orchestrator | 2026-04-20 01:02:45 | INFO  | Task acb3ad64-24dd-4281-bab9-e18d95e9e7b1 is in state STARTED 2026-04-20 01:02:45.634197 | orchestrator | 2026-04-20 01:02:45 | INFO  | Task 
9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:02:45.636380 | orchestrator | 2026-04-20 01:02:45 | INFO  | Task 697d75d7-b530-4670-979f-fedd4a50a857 is in state SUCCESS 2026-04-20 01:02:45.638483 | orchestrator | 2026-04-20 01:02:45.638531 | orchestrator | 2026-04-20 01:02:45.638538 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2026-04-20 01:02:45.638543 | orchestrator | 2026-04-20 01:02:45.638547 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2026-04-20 01:02:45.638551 | orchestrator | Monday 20 April 2026 01:02:26 +0000 (0:00:00.278) 0:00:00.278 ********** 2026-04-20 01:02:45.638555 | orchestrator | ok: [testbed-node-0] 2026-04-20 01:02:45.638560 | orchestrator | ok: [testbed-node-1] 2026-04-20 01:02:45.638578 | orchestrator | ok: [testbed-node-2] 2026-04-20 01:02:45.638584 | orchestrator | 2026-04-20 01:02:45.638590 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2026-04-20 01:02:45.638596 | orchestrator | Monday 20 April 2026 01:02:26 +0000 (0:00:00.276) 0:00:00.555 ********** 2026-04-20 01:02:45.638602 | orchestrator | ok: [testbed-node-0] => (item=enable_grafana_True) 2026-04-20 01:02:45.638607 | orchestrator | ok: [testbed-node-1] => (item=enable_grafana_True) 2026-04-20 01:02:45.638610 | orchestrator | ok: [testbed-node-2] => (item=enable_grafana_True) 2026-04-20 01:02:45.638614 | orchestrator | 2026-04-20 01:02:45.638618 | orchestrator | PLAY [Apply role grafana] ****************************************************** 2026-04-20 01:02:45.638632 | orchestrator | 2026-04-20 01:02:45.638636 | orchestrator | TASK [grafana : include_tasks] ************************************************* 2026-04-20 01:02:45.638640 | orchestrator | Monday 20 April 2026 01:02:27 +0000 (0:00:00.274) 0:00:00.830 ********** 2026-04-20 01:02:45.638644 | orchestrator | included: 
/ansible/roles/grafana/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 01:02:45.638649 | orchestrator | 2026-04-20 01:02:45.638653 | orchestrator | TASK [grafana : Ensuring config directories exist] ***************************** 2026-04-20 01:02:45.638657 | orchestrator | Monday 20 April 2026 01:02:27 +0000 (0:00:00.539) 0:00:01.369 ********** 2026-04-20 01:02:45.638663 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 01:02:45.638677 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 
2026-04-20 01:02:45.638682 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 01:02:45.638696 | orchestrator | 2026-04-20 01:02:45.638700 | orchestrator | TASK [grafana : Check if extra configuration file exists] ********************** 2026-04-20 01:02:45.638704 | orchestrator | Monday 20 April 2026 01:02:28 +0000 (0:00:00.986) 0:00:02.356 ********** 2026-04-20 01:02:45.638708 | orchestrator | ok: [testbed-node-0 -> localhost] 2026-04-20 01:02:45.638712 | orchestrator | 2026-04-20 01:02:45.638716 | orchestrator | TASK [grafana : include_tasks] ************************************************* 2026-04-20 01:02:45.638719 | orchestrator | Monday 20 April 2026 01:02:29 +0000 (0:00:00.770) 0:00:03.126 ********** 2026-04-20 01:02:45.638723 | orchestrator | included: /ansible/roles/grafana/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2026-04-20 01:02:45.638727 | orchestrator | 2026-04-20 01:02:45.638749 | orchestrator | TASK [service-cert-copy : grafana | Copying over extra CA certificates] ******** 2026-04-20 01:02:45.638757 | orchestrator | Monday 20 April 2026 01:02:29 +0000 (0:00:00.431) 0:00:03.557 ********** 2026-04-20 01:02:45.638764 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 
'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 01:02:45.638771 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 01:02:45.638777 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 
'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 01:02:45.638783 | orchestrator | 2026-04-20 01:02:45.638789 | orchestrator | TASK [service-cert-copy : grafana | Copying over backend internal TLS certificate] *** 2026-04-20 01:02:45.638794 | orchestrator | Monday 20 April 2026 01:02:31 +0000 (0:00:01.498) 0:00:05.056 ********** 2026-04-20 01:02:45.638804 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 01:02:45.638817 | orchestrator | skipping: [testbed-node-0] 2026-04-20 01:02:45.638828 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 
'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 01:02:45.638836 | orchestrator | skipping: [testbed-node-1] 2026-04-20 01:02:45.638842 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 01:02:45.638849 | orchestrator | skipping: [testbed-node-2] 2026-04-20 01:02:45.638863 | orchestrator | 2026-04-20 01:02:45.638868 | orchestrator | TASK [service-cert-copy : grafana | Copying over backend internal TLS key] ***** 2026-04-20 01:02:45.638872 | orchestrator | Monday 20 April 2026 01:02:31 +0000 (0:00:00.391) 0:00:05.448 ********** 2026-04-20 01:02:45.638876 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 
'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 01:02:45.638880 | orchestrator | skipping: [testbed-node-0] 2026-04-20 01:02:45.638884 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 01:02:45.638893 | orchestrator | skipping: [testbed-node-1] 2026-04-20 01:02:45.638897 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 
'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 01:02:45.638901 | orchestrator | skipping: [testbed-node-2] 2026-04-20 01:02:45.638904 | orchestrator | 2026-04-20 01:02:45.638908 | orchestrator | TASK [grafana : Copying over config.json files] ******************************** 2026-04-20 01:02:45.638912 | orchestrator | Monday 20 April 2026 01:02:32 +0000 (0:00:00.537) 0:00:05.985 ********** 2026-04-20 01:02:45.638920 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 01:02:45.638924 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 
'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 01:02:45.638928 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 01:02:45.638932 | orchestrator | 2026-04-20 01:02:45.638936 | orchestrator | TASK [grafana : Copying over grafana.ini] ************************************** 2026-04-20 01:02:45.638940 | orchestrator | Monday 20 April 2026 01:02:33 +0000 (0:00:01.218) 0:00:07.204 ********** 2026-04-20 01:02:45.638948 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 
'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 01:02:45.638952 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 01:02:45.638960 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 01:02:45.638970 | orchestrator | 2026-04-20 01:02:45.638974 | orchestrator | TASK [grafana : Copying over extra configuration file] ************************* 2026-04-20 01:02:45.638977 | orchestrator | Monday 20 April 2026 01:02:35 +0000 (0:00:01.447) 0:00:08.652 ********** 2026-04-20 01:02:45.638981 | orchestrator | 
skipping: [testbed-node-0]
2026-04-20 01:02:45.638985 | orchestrator | skipping: [testbed-node-1]
2026-04-20 01:02:45.638989 | orchestrator | skipping: [testbed-node-2]
2026-04-20 01:02:45.638992 | orchestrator |
2026-04-20 01:02:45.638996 | orchestrator | TASK [grafana : Configuring Prometheus as data source for Grafana] *************
2026-04-20 01:02:45.639000 | orchestrator | Monday 20 April 2026 01:02:35 +0000 (0:00:00.230) 0:00:08.882 **********
2026-04-20 01:02:45.639004 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/grafana/templates/prometheus.yaml.j2)
2026-04-20 01:02:45.639008 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/grafana/templates/prometheus.yaml.j2)
2026-04-20 01:02:45.639012 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/grafana/templates/prometheus.yaml.j2)
2026-04-20 01:02:45.639016 | orchestrator |
2026-04-20 01:02:45.639022 | orchestrator | TASK [grafana : Configuring dashboards provisioning] ***************************
2026-04-20 01:02:45.639029 | orchestrator | Monday 20 April 2026 01:02:36 +0000 (0:00:01.075) 0:00:09.958 **********
2026-04-20 01:02:45.639035 | orchestrator | changed: [testbed-node-0] => (item=/opt/configuration/environments/kolla/files/overlays/grafana/provisioning.yaml)
2026-04-20 01:02:45.639042 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/files/overlays/grafana/provisioning.yaml)
2026-04-20 01:02:45.639054 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/files/overlays/grafana/provisioning.yaml)
2026-04-20 01:02:45.639061 | orchestrator |
2026-04-20 01:02:45.639068 | orchestrator | TASK [grafana : Check if the folder for custom grafana dashboards exists] ******
2026-04-20 01:02:45.639075 | orchestrator | Monday 20 April 2026 01:02:37 +0000 (0:00:01.148) 0:00:11.106 **********
2026-04-20 01:02:45.639079 | orchestrator | ok: [testbed-node-0 -> localhost]
2026-04-20 01:02:45.639083 | orchestrator |
2026-04-20 01:02:45.639087 | orchestrator | TASK [grafana : Remove templated Grafana dashboards] ***************************
2026-04-20 01:02:45.639091 | orchestrator | Monday 20 April 2026 01:02:38 +0000 (0:00:00.673) 0:00:11.780 **********
2026-04-20 01:02:45.639094 | orchestrator | ok: [testbed-node-0]
2026-04-20 01:02:45.639098 | orchestrator | ok: [testbed-node-1]
2026-04-20 01:02:45.639102 | orchestrator | ok: [testbed-node-2]
2026-04-20 01:02:45.639106 | orchestrator |
2026-04-20 01:02:45.639111 | orchestrator | TASK [grafana : Copying over custom dashboards] ********************************
2026-04-20 01:02:45.639115 | orchestrator | Monday 20 April 2026 01:02:38 +0000 (0:00:01.119) 0:00:12.555 **********
2026-04-20 01:02:45.639120 | orchestrator | changed: [testbed-node-0]
2026-04-20 01:02:45.639124 | orchestrator | changed: [testbed-node-1]
2026-04-20 01:02:45.639128 | orchestrator | changed: [testbed-node-2]
2026-04-20 01:02:45.639132 | orchestrator |
2026-04-20 01:02:45.639137 | orchestrator | TASK [service-check-containers : grafana | Check containers] *******************
2026-04-20 01:02:45.639141 | orchestrator | Monday 20 April 2026 01:02:40 +0000 (0:00:01.119) 0:00:13.674 **********
2026-04-20 01:02:45.639151 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000',
'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 01:02:45.639158 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 01:02:45.639169 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}}) 2026-04-20 01:02:45.639176 | orchestrator | 2026-04-20 01:02:45.639191 | orchestrator | TASK [service-check-containers : grafana | Notify handlers to restart containers] *** 2026-04-20 01:02:45.639199 | orchestrator | Monday 20 April 2026 01:02:40 +0000 (0:00:00.925) 0:00:14.599 ********** 2026-04-20 
01:02:45.639204 | orchestrator | changed: [testbed-node-0] => { 2026-04-20 01:02:45.639211 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 01:02:45.639217 | orchestrator | } 2026-04-20 01:02:45.639223 | orchestrator | changed: [testbed-node-1] => { 2026-04-20 01:02:45.639229 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 01:02:45.639235 | orchestrator | } 2026-04-20 01:02:45.639242 | orchestrator | changed: [testbed-node-2] => { 2026-04-20 01:02:45.639248 | orchestrator |  "msg": "Notifying handlers" 2026-04-20 01:02:45.639255 | orchestrator | } 2026-04-20 01:02:45.639261 | orchestrator | 2026-04-20 01:02:45.639267 | orchestrator | TASK [service-check-containers : Include tasks] ******************************** 2026-04-20 01:02:45.639273 | orchestrator | Monday 20 April 2026 01:02:41 +0000 (0:00:00.267) 0:00:14.867 ********** 2026-04-20 01:02:45.639279 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 01:02:45.639289 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 01:02:45.639296 | orchestrator | skipping: [testbed-node-0] 2026-04-20 01:02:45.639302 | orchestrator | skipping: [testbed-node-1] 2026-04-20 01:02:45.639309 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/2024.2/grafana:12.4.2.20260328', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000', 'backend_http_extra': ['option httpchk']}}}})  2026-04-20 01:02:45.639315 | orchestrator | skipping: [testbed-node-2] 2026-04-20 01:02:45.639321 | orchestrator | 2026-04-20 01:02:45.639327 | orchestrator | TASK [grafana : Creating grafana database] ************************************* 2026-04-20 01:02:45.639334 | orchestrator | Monday 20 April 2026 01:02:41 +0000 (0:00:00.647) 0:00:15.514 ********** 2026-04-20 01:02:45.639346 | orchestrator | fatal: [testbed-node-0]: FAILED! 
=> {"changed": false, "msg": "kolla_toolbox container is missing or not running!"}
2026-04-20 01:02:45.639353 | orchestrator |
2026-04-20 01:02:45.639359 | orchestrator | PLAY RECAP *********************************************************************
2026-04-20 01:02:45.639371 | orchestrator | testbed-node-0 : ok=16  changed=9  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0
2026-04-20 01:02:45.639379 | orchestrator | testbed-node-1 : ok=14  changed=9  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0
2026-04-20 01:02:45.639386 | orchestrator | testbed-node-2 : ok=14  changed=9  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0
2026-04-20 01:02:45.639391 | orchestrator |
2026-04-20 01:02:45.639398 | orchestrator |
2026-04-20 01:02:45.639405 | orchestrator | TASKS RECAP ********************************************************************
2026-04-20 01:02:45.639411 | orchestrator | Monday 20 April 2026 01:02:42 +0000 (0:00:00.679) 0:00:16.193 **********
2026-04-20 01:02:45.639417 | orchestrator | ===============================================================================
2026-04-20 01:02:45.639424 | orchestrator | service-cert-copy : grafana | Copying over extra CA certificates -------- 1.50s
2026-04-20 01:02:45.639431 | orchestrator | grafana : Copying over grafana.ini -------------------------------------- 1.45s
2026-04-20 01:02:45.639437 | orchestrator | grafana : Copying over config.json files -------------------------------- 1.22s
2026-04-20 01:02:45.639443 | orchestrator | grafana : Configuring dashboards provisioning --------------------------- 1.15s
2026-04-20 01:02:45.639449 | orchestrator | grafana : Copying over custom dashboards -------------------------------- 1.12s
2026-04-20 01:02:45.639455 | orchestrator | grafana : Configuring Prometheus as data source for Grafana ------------- 1.08s
2026-04-20 01:02:45.639461 | orchestrator | grafana : Ensuring config directories exist ----------------------------- 0.99s
2026-04-20 01:02:45.639468 | orchestrator | service-check-containers : grafana | Check containers ------------------- 0.93s
2026-04-20 01:02:45.639473 | orchestrator | grafana : Remove templated Grafana dashboards --------------------------- 0.77s
2026-04-20 01:02:45.639479 | orchestrator | grafana : Check if extra configuration file exists ---------------------- 0.77s
2026-04-20 01:02:45.639485 | orchestrator | grafana : Creating grafana database ------------------------------------- 0.68s
2026-04-20 01:02:45.639491 | orchestrator | grafana : Check if the folder for custom grafana dashboards exists ------ 0.67s
2026-04-20 01:02:45.639498 | orchestrator | service-check-containers : Include tasks -------------------------------- 0.65s
2026-04-20 01:02:45.639503 | orchestrator | grafana : include_tasks ------------------------------------------------- 0.54s
2026-04-20 01:02:45.639509 | orchestrator | service-cert-copy : grafana | Copying over backend internal TLS key ----- 0.54s
2026-04-20 01:02:45.639515 | orchestrator | grafana : include_tasks ------------------------------------------------- 0.43s
2026-04-20 01:02:45.639522 | orchestrator | service-cert-copy : grafana | Copying over backend internal TLS certificate --- 0.39s
2026-04-20 01:02:45.639528 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.28s
2026-04-20 01:02:45.639534 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.27s
2026-04-20 01:02:45.639540 | orchestrator | service-check-containers : grafana | Notify handlers to restart containers --- 0.27s
2026-04-20 01:02:45.639546 | orchestrator | 2026-04-20 01:02:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 01:02:45.639552 | orchestrator | 2026-04-20 01:02:45 | INFO  | Wait 1 second(s) until the next check
2026-04-20 01:02:48.681129 | orchestrator | 2026-04-20 01:02:48 | INFO  | Task acb3ad64-24dd-4281-bab9-e18d95e9e7b1 is in state STARTED
2026-04-20 01:02:48.682185 | orchestrator | 2026-04-20 01:02:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:02:48.684316 | orchestrator | 2026-04-20 01:02:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:02:48.684370 | orchestrator | 2026-04-20 01:02:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:02:51.732235 | orchestrator | 2026-04-20 01:02:51 | INFO  | Task acb3ad64-24dd-4281-bab9-e18d95e9e7b1 is in state SUCCESS 2026-04-20 01:02:51.733681 | orchestrator | 2026-04-20 01:02:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:02:51.736094 | orchestrator | 2026-04-20 01:02:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:02:51.736428 | orchestrator | 2026-04-20 01:02:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:02:54.777960 | orchestrator | 2026-04-20 01:02:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:02:54.779420 | orchestrator | 2026-04-20 01:02:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:02:54.779475 | orchestrator | 2026-04-20 01:02:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:02:57.821914 | orchestrator | 2026-04-20 01:02:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:02:57.823510 | orchestrator | 2026-04-20 01:02:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:02:57.823616 | orchestrator | 2026-04-20 01:02:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:03:00.868571 | orchestrator | 2026-04-20 01:03:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:03:00.869612 | orchestrator | 2026-04-20 01:03:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:03:00.869652 | orchestrator | 2026-04-20 
01:03:49.568146 | orchestrator | 2026-04-20 01:03:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:03:49.568187 | orchestrator | 2026-04-20 01:03:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:03:52.615987 | orchestrator | 2026-04-20 01:03:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:03:52.618257 | orchestrator | 2026-04-20 01:03:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:03:52.618316 | orchestrator | 2026-04-20 01:03:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:03:55.662756 | orchestrator | 2026-04-20 01:03:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:03:55.663290 | orchestrator | 2026-04-20 01:03:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:03:55.663328 | orchestrator | 2026-04-20 01:03:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:03:58.707075 | orchestrator | 2026-04-20 01:03:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:03:58.708742 | orchestrator | 2026-04-20 01:03:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:03:58.708812 | orchestrator | 2026-04-20 01:03:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:01.752825 | orchestrator | 2026-04-20 01:04:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:01.754728 | orchestrator | 2026-04-20 01:04:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:01.754784 | orchestrator | 2026-04-20 01:04:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:04.800571 | orchestrator | 2026-04-20 01:04:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:04.803672 | orchestrator | 2026-04-20 01:04:04 | INFO  | Task 
16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:04.803723 | orchestrator | 2026-04-20 01:04:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:07.852189 | orchestrator | 2026-04-20 01:04:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:07.854933 | orchestrator | 2026-04-20 01:04:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:07.854984 | orchestrator | 2026-04-20 01:04:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:10.902732 | orchestrator | 2026-04-20 01:04:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:10.903944 | orchestrator | 2026-04-20 01:04:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:10.903974 | orchestrator | 2026-04-20 01:04:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:13.945260 | orchestrator | 2026-04-20 01:04:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:13.946154 | orchestrator | 2026-04-20 01:04:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:13.946197 | orchestrator | 2026-04-20 01:04:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:16.983887 | orchestrator | 2026-04-20 01:04:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:16.984687 | orchestrator | 2026-04-20 01:04:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:16.984748 | orchestrator | 2026-04-20 01:04:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:20.027708 | orchestrator | 2026-04-20 01:04:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:20.034079 | orchestrator | 2026-04-20 01:04:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
01:04:20.034137 | orchestrator | 2026-04-20 01:04:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:23.081147 | orchestrator | 2026-04-20 01:04:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:23.082953 | orchestrator | 2026-04-20 01:04:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:23.083005 | orchestrator | 2026-04-20 01:04:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:26.128738 | orchestrator | 2026-04-20 01:04:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:26.130602 | orchestrator | 2026-04-20 01:04:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:26.130666 | orchestrator | 2026-04-20 01:04:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:29.170950 | orchestrator | 2026-04-20 01:04:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:29.171615 | orchestrator | 2026-04-20 01:04:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:29.171827 | orchestrator | 2026-04-20 01:04:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:32.218619 | orchestrator | 2026-04-20 01:04:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:32.221697 | orchestrator | 2026-04-20 01:04:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:32.221766 | orchestrator | 2026-04-20 01:04:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:35.265776 | orchestrator | 2026-04-20 01:04:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:35.267596 | orchestrator | 2026-04-20 01:04:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:35.267659 | orchestrator | 2026-04-20 01:04:35 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 01:04:38.309830 | orchestrator | 2026-04-20 01:04:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:38.311778 | orchestrator | 2026-04-20 01:04:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:38.311823 | orchestrator | 2026-04-20 01:04:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:41.359154 | orchestrator | 2026-04-20 01:04:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:41.361764 | orchestrator | 2026-04-20 01:04:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:41.361822 | orchestrator | 2026-04-20 01:04:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:44.409235 | orchestrator | 2026-04-20 01:04:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:44.411733 | orchestrator | 2026-04-20 01:04:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:44.411783 | orchestrator | 2026-04-20 01:04:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:47.462366 | orchestrator | 2026-04-20 01:04:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:47.464681 | orchestrator | 2026-04-20 01:04:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:47.464727 | orchestrator | 2026-04-20 01:04:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:50.511381 | orchestrator | 2026-04-20 01:04:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:50.514180 | orchestrator | 2026-04-20 01:04:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:50.514274 | orchestrator | 2026-04-20 01:04:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:53.561375 | orchestrator | 2026-04-20 
01:04:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:53.566078 | orchestrator | 2026-04-20 01:04:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:53.566126 | orchestrator | 2026-04-20 01:04:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:56.618130 | orchestrator | 2026-04-20 01:04:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:56.621501 | orchestrator | 2026-04-20 01:04:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:56.621588 | orchestrator | 2026-04-20 01:04:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:04:59.670984 | orchestrator | 2026-04-20 01:04:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:04:59.672610 | orchestrator | 2026-04-20 01:04:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:04:59.672779 | orchestrator | 2026-04-20 01:04:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:05:02.720869 | orchestrator | 2026-04-20 01:05:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:05:02.723654 | orchestrator | 2026-04-20 01:05:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:05:02.723782 | orchestrator | 2026-04-20 01:05:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:05:05.769321 | orchestrator | 2026-04-20 01:05:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:05:05.770751 | orchestrator | 2026-04-20 01:05:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:05:05.770870 | orchestrator | 2026-04-20 01:05:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:05:08.819713 | orchestrator | 2026-04-20 01:05:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 01:05:08.821417 | orchestrator | 2026-04-20 01:05:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:05:08.821472 | orchestrator | 2026-04-20 01:05:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:05:11.867471 | orchestrator | 2026-04-20 01:05:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:05:11.867522 | orchestrator | 2026-04-20 01:05:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:05:11.867528 | orchestrator | 2026-04-20 01:05:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:05:14.915823 | orchestrator | 2026-04-20 01:05:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:05:14.917871 | orchestrator | 2026-04-20 01:05:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:05:14.917951 | orchestrator | 2026-04-20 01:05:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:05:17.965218 | orchestrator | 2026-04-20 01:05:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:05:17.967557 | orchestrator | 2026-04-20 01:05:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:05:17.968331 | orchestrator | 2026-04-20 01:05:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:05:21.020552 | orchestrator | 2026-04-20 01:05:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:05:21.022728 | orchestrator | 2026-04-20 01:05:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:05:21.022803 | orchestrator | 2026-04-20 01:05:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:05:24.067243 | orchestrator | 2026-04-20 01:05:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:05:24.069356 | orchestrator | 2026-04-20 01:05:24 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:05:24.069469 | orchestrator | 2026-04-20 01:05:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:05:27.111315 | orchestrator | 2026-04-20 01:05:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:05:27.112781 | orchestrator | 2026-04-20 01:05:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:05:27.112856 | orchestrator | 2026-04-20 01:05:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:05:30.152616 | orchestrator | 2026-04-20 01:05:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:05:30.153722 | orchestrator | 2026-04-20 01:05:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:05:30.153764 | orchestrator | 2026-04-20 01:05:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:05:33.196944 | orchestrator | 2026-04-20 01:05:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:05:33.199124 | orchestrator | 2026-04-20 01:05:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:05:33.199170 | orchestrator | 2026-04-20 01:05:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:05:36.242230 | orchestrator | 2026-04-20 01:05:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:05:36.244082 | orchestrator | 2026-04-20 01:05:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:05:36.244127 | orchestrator | 2026-04-20 01:05:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:05:39.287558 | orchestrator | 2026-04-20 01:05:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:05:39.288556 | orchestrator | 2026-04-20 01:05:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
01:05:39.288731 | orchestrator | 2026-04-20 01:05:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:05:42.335328 | orchestrator | 2026-04-20 01:05:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:05:42.337508 | orchestrator | 2026-04-20 01:05:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:05:42.337591 | orchestrator | 2026-04-20 01:05:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:05:45.388621 | orchestrator | 2026-04-20 01:05:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:05:45.389954 | orchestrator | 2026-04-20 01:05:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:05:45.390011 | orchestrator | 2026-04-20 01:05:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:05:48.438723 | orchestrator | 2026-04-20 01:05:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:05:48.440459 | orchestrator | 2026-04-20 01:05:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:05:48.440524 | orchestrator | 2026-04-20 01:05:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:05:51.485004 | orchestrator | 2026-04-20 01:05:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:05:51.486676 | orchestrator | 2026-04-20 01:05:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:05:51.486967 | orchestrator | 2026-04-20 01:05:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:05:54.527828 | orchestrator | 2026-04-20 01:05:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:05:54.529367 | orchestrator | 2026-04-20 01:05:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:05:54.529630 | orchestrator | 2026-04-20 01:05:54 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 01:05:57.576744 | orchestrator | 2026-04-20 01:05:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:05:57.578062 | orchestrator | 2026-04-20 01:05:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:05:57.578116 | orchestrator | 2026-04-20 01:05:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:00.613666 | orchestrator | 2026-04-20 01:06:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:00.615881 | orchestrator | 2026-04-20 01:06:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:00.615960 | orchestrator | 2026-04-20 01:06:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:03.654578 | orchestrator | 2026-04-20 01:06:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:03.656083 | orchestrator | 2026-04-20 01:06:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:03.656143 | orchestrator | 2026-04-20 01:06:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:06.698318 | orchestrator | 2026-04-20 01:06:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:06.699994 | orchestrator | 2026-04-20 01:06:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:06.700053 | orchestrator | 2026-04-20 01:06:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:09.740651 | orchestrator | 2026-04-20 01:06:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:09.742115 | orchestrator | 2026-04-20 01:06:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:09.742171 | orchestrator | 2026-04-20 01:06:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:12.783654 | orchestrator | 2026-04-20 
01:06:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:12.785081 | orchestrator | 2026-04-20 01:06:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:12.785170 | orchestrator | 2026-04-20 01:06:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:15.826932 | orchestrator | 2026-04-20 01:06:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:15.827972 | orchestrator | 2026-04-20 01:06:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:15.828049 | orchestrator | 2026-04-20 01:06:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:18.871699 | orchestrator | 2026-04-20 01:06:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:18.873452 | orchestrator | 2026-04-20 01:06:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:18.873499 | orchestrator | 2026-04-20 01:06:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:21.925647 | orchestrator | 2026-04-20 01:06:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:21.927120 | orchestrator | 2026-04-20 01:06:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:21.927192 | orchestrator | 2026-04-20 01:06:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:24.967255 | orchestrator | 2026-04-20 01:06:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:24.969017 | orchestrator | 2026-04-20 01:06:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:24.969056 | orchestrator | 2026-04-20 01:06:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:28.020956 | orchestrator | 2026-04-20 01:06:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 01:06:28.022963 | orchestrator | 2026-04-20 01:06:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:28.023033 | orchestrator | 2026-04-20 01:06:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:31.061712 | orchestrator | 2026-04-20 01:06:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:31.064829 | orchestrator | 2026-04-20 01:06:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:31.064883 | orchestrator | 2026-04-20 01:06:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:34.109612 | orchestrator | 2026-04-20 01:06:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:34.111532 | orchestrator | 2026-04-20 01:06:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:34.111608 | orchestrator | 2026-04-20 01:06:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:37.155405 | orchestrator | 2026-04-20 01:06:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:37.157070 | orchestrator | 2026-04-20 01:06:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:37.157202 | orchestrator | 2026-04-20 01:06:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:40.208957 | orchestrator | 2026-04-20 01:06:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:40.210864 | orchestrator | 2026-04-20 01:06:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:40.210907 | orchestrator | 2026-04-20 01:06:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:43.257541 | orchestrator | 2026-04-20 01:06:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:43.258044 | orchestrator | 2026-04-20 01:06:43 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:43.258092 | orchestrator | 2026-04-20 01:06:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:46.307707 | orchestrator | 2026-04-20 01:06:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:46.310295 | orchestrator | 2026-04-20 01:06:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:46.310460 | orchestrator | 2026-04-20 01:06:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:49.353928 | orchestrator | 2026-04-20 01:06:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:49.355598 | orchestrator | 2026-04-20 01:06:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:49.355647 | orchestrator | 2026-04-20 01:06:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:52.394249 | orchestrator | 2026-04-20 01:06:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:52.394521 | orchestrator | 2026-04-20 01:06:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:52.394537 | orchestrator | 2026-04-20 01:06:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:55.439534 | orchestrator | 2026-04-20 01:06:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:55.440824 | orchestrator | 2026-04-20 01:06:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:06:55.440858 | orchestrator | 2026-04-20 01:06:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:06:58.489251 | orchestrator | 2026-04-20 01:06:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:06:58.491700 | orchestrator | 2026-04-20 01:06:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
01:06:58.492125 | orchestrator | 2026-04-20 01:06:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:07:01.530727 | orchestrator | 2026-04-20 01:07:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:07:01.532824 | orchestrator | 2026-04-20 01:07:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:07:01.532865 | orchestrator | 2026-04-20 01:07:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:07:04.575178 | orchestrator | 2026-04-20 01:07:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:07:04.577290 | orchestrator | 2026-04-20 01:07:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:07:04.577647 | orchestrator | 2026-04-20 01:07:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:07:07.625976 | orchestrator | 2026-04-20 01:07:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:07:07.627289 | orchestrator | 2026-04-20 01:07:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:07:07.627588 | orchestrator | 2026-04-20 01:07:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:07:10.670939 | orchestrator | 2026-04-20 01:07:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:07:10.672496 | orchestrator | 2026-04-20 01:07:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:07:10.672560 | orchestrator | 2026-04-20 01:07:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:07:13.715763 | orchestrator | 2026-04-20 01:07:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:07:13.716183 | orchestrator | 2026-04-20 01:07:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:07:13.716215 | orchestrator | 2026-04-20 01:07:13 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 01:07:16.758960 | orchestrator | 2026-04-20 01:07:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:07:16.762141 | orchestrator | 2026-04-20 01:07:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:07:16.762224 | orchestrator | 2026-04-20 01:07:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:07:19.805061 | orchestrator | 2026-04-20 01:07:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:07:19.806231 | orchestrator | 2026-04-20 01:07:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:07:19.806286 | orchestrator | 2026-04-20 01:07:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:07:22.853069 | orchestrator | 2026-04-20 01:07:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:07:22.854232 | orchestrator | 2026-04-20 01:07:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:07:22.854277 | orchestrator | 2026-04-20 01:07:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:07:25.893525 | orchestrator | 2026-04-20 01:07:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:07:25.894963 | orchestrator | 2026-04-20 01:07:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:07:25.894985 | orchestrator | 2026-04-20 01:07:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:07:28.936940 | orchestrator | 2026-04-20 01:07:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:07:28.938689 | orchestrator | 2026-04-20 01:07:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:07:28.938853 | orchestrator | 2026-04-20 01:07:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:07:31.985465 | orchestrator | 2026-04-20 
01:07:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:07:31.987691 | orchestrator | 2026-04-20 01:07:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:07:31.987754 | orchestrator | 2026-04-20 01:07:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:07:35.035265 | orchestrator | 2026-04-20 01:07:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:07:35.037915 | orchestrator | 2026-04-20 01:07:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:07:35.037977 | orchestrator | 2026-04-20 01:07:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:07:38.079720 | orchestrator | 2026-04-20 01:07:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:07:38.081616 | orchestrator | 2026-04-20 01:07:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:07:38.081712 | orchestrator | 2026-04-20 01:07:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:07:41.135746 | orchestrator | 2026-04-20 01:07:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:07:41.137555 | orchestrator | 2026-04-20 01:07:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:07:41.137639 | orchestrator | 2026-04-20 01:07:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:07:44.175006 | orchestrator | 2026-04-20 01:07:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:07:44.176230 | orchestrator | 2026-04-20 01:07:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:07:44.176335 | orchestrator | 2026-04-20 01:07:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:07:47.228372 | orchestrator | 2026-04-20 01:07:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED
2026-04-20 01:07:47.230375 | orchestrator | 2026-04-20 01:07:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 01:07:47.230466 | orchestrator | 2026-04-20 01:07:47 | INFO  | Wait 1 second(s) until the next check
2026-04-20 01:07:50.275934 | orchestrator | 2026-04-20 01:07:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 01:07:50.278205 | orchestrator | 2026-04-20 01:07:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 01:07:50.278569 | orchestrator | 2026-04-20 01:07:50 | INFO  | Wait 1 second(s) until the next check
[identical status checks repeated every ~3 seconds; tasks 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e and 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c remained in state STARTED from 01:07:53 through 01:13:01]
2026-04-20 01:13:01.152104 | orchestrator | 2026-04-20 01:13:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 01:13:01.153564 | orchestrator | 2026-04-20 01:13:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 01:13:01.153908 | orchestrator | 2026-04-20 01:13:01 | INFO  | Wait 1 second(s) until the next check
2026-04-20 01:13:04.197022 | orchestrator | 2026-04-20 01:13:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state
STARTED 2026-04-20 01:13:04.199861 | orchestrator | 2026-04-20 01:13:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:13:04.199951 | orchestrator | 2026-04-20 01:13:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:13:07.242566 | orchestrator | 2026-04-20 01:13:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:13:07.244394 | orchestrator | 2026-04-20 01:13:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:13:07.244465 | orchestrator | 2026-04-20 01:13:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:13:10.294573 | orchestrator | 2026-04-20 01:13:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:13:10.295678 | orchestrator | 2026-04-20 01:13:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:13:10.295752 | orchestrator | 2026-04-20 01:13:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:13:13.341754 | orchestrator | 2026-04-20 01:13:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:13:13.342824 | orchestrator | 2026-04-20 01:13:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:13:13.343009 | orchestrator | 2026-04-20 01:13:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:13:16.390713 | orchestrator | 2026-04-20 01:13:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:13:16.392425 | orchestrator | 2026-04-20 01:13:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:13:16.392480 | orchestrator | 2026-04-20 01:13:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:13:19.439394 | orchestrator | 2026-04-20 01:13:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:13:19.441212 | orchestrator | 2026-04-20 01:13:19 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:13:19.441314 | orchestrator | 2026-04-20 01:13:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:13:22.495391 | orchestrator | 2026-04-20 01:13:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:13:22.497437 | orchestrator | 2026-04-20 01:13:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:13:22.498132 | orchestrator | 2026-04-20 01:13:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:13:25.545808 | orchestrator | 2026-04-20 01:13:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:13:25.546940 | orchestrator | 2026-04-20 01:13:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:13:25.547287 | orchestrator | 2026-04-20 01:13:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:13:28.591043 | orchestrator | 2026-04-20 01:13:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:13:28.592710 | orchestrator | 2026-04-20 01:13:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:13:28.592996 | orchestrator | 2026-04-20 01:13:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:13:31.641258 | orchestrator | 2026-04-20 01:13:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:13:31.643609 | orchestrator | 2026-04-20 01:13:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:13:31.643725 | orchestrator | 2026-04-20 01:13:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:13:34.691031 | orchestrator | 2026-04-20 01:13:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:13:34.692125 | orchestrator | 2026-04-20 01:13:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
01:13:34.692266 | orchestrator | 2026-04-20 01:13:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:13:37.733273 | orchestrator | 2026-04-20 01:13:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:13:37.735254 | orchestrator | 2026-04-20 01:13:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:13:37.735356 | orchestrator | 2026-04-20 01:13:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:13:40.771210 | orchestrator | 2026-04-20 01:13:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:13:40.773220 | orchestrator | 2026-04-20 01:13:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:13:40.773320 | orchestrator | 2026-04-20 01:13:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:13:43.818527 | orchestrator | 2026-04-20 01:13:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:13:43.820675 | orchestrator | 2026-04-20 01:13:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:13:43.820724 | orchestrator | 2026-04-20 01:13:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:13:46.864803 | orchestrator | 2026-04-20 01:13:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:13:46.867121 | orchestrator | 2026-04-20 01:13:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:13:46.867211 | orchestrator | 2026-04-20 01:13:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:13:49.910344 | orchestrator | 2026-04-20 01:13:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:13:49.911472 | orchestrator | 2026-04-20 01:13:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:13:49.911505 | orchestrator | 2026-04-20 01:13:49 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 01:13:52.954746 | orchestrator | 2026-04-20 01:13:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:13:52.956130 | orchestrator | 2026-04-20 01:13:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:13:52.956201 | orchestrator | 2026-04-20 01:13:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:13:55.998353 | orchestrator | 2026-04-20 01:13:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:13:55.999809 | orchestrator | 2026-04-20 01:13:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:13:55.999888 | orchestrator | 2026-04-20 01:13:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:13:59.041579 | orchestrator | 2026-04-20 01:13:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:13:59.043006 | orchestrator | 2026-04-20 01:13:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:13:59.043064 | orchestrator | 2026-04-20 01:13:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:02.082180 | orchestrator | 2026-04-20 01:14:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:14:02.084065 | orchestrator | 2026-04-20 01:14:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:14:02.084184 | orchestrator | 2026-04-20 01:14:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:05.131933 | orchestrator | 2026-04-20 01:14:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:14:05.134333 | orchestrator | 2026-04-20 01:14:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:14:05.134417 | orchestrator | 2026-04-20 01:14:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:08.181199 | orchestrator | 2026-04-20 
01:14:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:14:08.182847 | orchestrator | 2026-04-20 01:14:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:14:08.182902 | orchestrator | 2026-04-20 01:14:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:11.229182 | orchestrator | 2026-04-20 01:14:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:14:11.230939 | orchestrator | 2026-04-20 01:14:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:14:11.231009 | orchestrator | 2026-04-20 01:14:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:14.279225 | orchestrator | 2026-04-20 01:14:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:14:14.281630 | orchestrator | 2026-04-20 01:14:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:14:14.281722 | orchestrator | 2026-04-20 01:14:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:17.324389 | orchestrator | 2026-04-20 01:14:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:14:17.326388 | orchestrator | 2026-04-20 01:14:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:14:17.326448 | orchestrator | 2026-04-20 01:14:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:20.369743 | orchestrator | 2026-04-20 01:14:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:14:20.370781 | orchestrator | 2026-04-20 01:14:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:14:20.370959 | orchestrator | 2026-04-20 01:14:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:23.422156 | orchestrator | 2026-04-20 01:14:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 01:14:23.424747 | orchestrator | 2026-04-20 01:14:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:14:23.424910 | orchestrator | 2026-04-20 01:14:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:26.472339 | orchestrator | 2026-04-20 01:14:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:14:26.475695 | orchestrator | 2026-04-20 01:14:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:14:26.475752 | orchestrator | 2026-04-20 01:14:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:29.528306 | orchestrator | 2026-04-20 01:14:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:14:29.531391 | orchestrator | 2026-04-20 01:14:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:14:29.531471 | orchestrator | 2026-04-20 01:14:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:32.575098 | orchestrator | 2026-04-20 01:14:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:14:32.577157 | orchestrator | 2026-04-20 01:14:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:14:32.577276 | orchestrator | 2026-04-20 01:14:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:35.618671 | orchestrator | 2026-04-20 01:14:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:14:35.620454 | orchestrator | 2026-04-20 01:14:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:14:35.620518 | orchestrator | 2026-04-20 01:14:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:38.667852 | orchestrator | 2026-04-20 01:14:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:14:38.670372 | orchestrator | 2026-04-20 01:14:38 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:14:38.670524 | orchestrator | 2026-04-20 01:14:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:41.713853 | orchestrator | 2026-04-20 01:14:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:14:41.715134 | orchestrator | 2026-04-20 01:14:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:14:41.715189 | orchestrator | 2026-04-20 01:14:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:44.764385 | orchestrator | 2026-04-20 01:14:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:14:44.766766 | orchestrator | 2026-04-20 01:14:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:14:44.767223 | orchestrator | 2026-04-20 01:14:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:47.812958 | orchestrator | 2026-04-20 01:14:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:14:47.814553 | orchestrator | 2026-04-20 01:14:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:14:47.814647 | orchestrator | 2026-04-20 01:14:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:50.862691 | orchestrator | 2026-04-20 01:14:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:14:50.864518 | orchestrator | 2026-04-20 01:14:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:14:50.864605 | orchestrator | 2026-04-20 01:14:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:53.911040 | orchestrator | 2026-04-20 01:14:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:14:53.913497 | orchestrator | 2026-04-20 01:14:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
01:14:53.913566 | orchestrator | 2026-04-20 01:14:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:14:56.958703 | orchestrator | 2026-04-20 01:14:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:14:56.961101 | orchestrator | 2026-04-20 01:14:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:14:56.961167 | orchestrator | 2026-04-20 01:14:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:00.011966 | orchestrator | 2026-04-20 01:15:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:00.015008 | orchestrator | 2026-04-20 01:15:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:00.015090 | orchestrator | 2026-04-20 01:15:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:03.060342 | orchestrator | 2026-04-20 01:15:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:03.061719 | orchestrator | 2026-04-20 01:15:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:03.061782 | orchestrator | 2026-04-20 01:15:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:06.106628 | orchestrator | 2026-04-20 01:15:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:06.109317 | orchestrator | 2026-04-20 01:15:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:06.109420 | orchestrator | 2026-04-20 01:15:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:09.154309 | orchestrator | 2026-04-20 01:15:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:09.155975 | orchestrator | 2026-04-20 01:15:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:09.156025 | orchestrator | 2026-04-20 01:15:09 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 01:15:12.201048 | orchestrator | 2026-04-20 01:15:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:12.203033 | orchestrator | 2026-04-20 01:15:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:12.203077 | orchestrator | 2026-04-20 01:15:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:15.248322 | orchestrator | 2026-04-20 01:15:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:15.250419 | orchestrator | 2026-04-20 01:15:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:15.250492 | orchestrator | 2026-04-20 01:15:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:18.292150 | orchestrator | 2026-04-20 01:15:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:18.295140 | orchestrator | 2026-04-20 01:15:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:18.295265 | orchestrator | 2026-04-20 01:15:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:21.335112 | orchestrator | 2026-04-20 01:15:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:21.336354 | orchestrator | 2026-04-20 01:15:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:21.336417 | orchestrator | 2026-04-20 01:15:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:24.383452 | orchestrator | 2026-04-20 01:15:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:24.385696 | orchestrator | 2026-04-20 01:15:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:24.385774 | orchestrator | 2026-04-20 01:15:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:27.427511 | orchestrator | 2026-04-20 
01:15:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:27.429494 | orchestrator | 2026-04-20 01:15:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:27.429793 | orchestrator | 2026-04-20 01:15:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:30.475324 | orchestrator | 2026-04-20 01:15:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:30.476719 | orchestrator | 2026-04-20 01:15:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:30.476795 | orchestrator | 2026-04-20 01:15:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:33.521758 | orchestrator | 2026-04-20 01:15:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:33.524325 | orchestrator | 2026-04-20 01:15:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:33.524383 | orchestrator | 2026-04-20 01:15:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:36.573344 | orchestrator | 2026-04-20 01:15:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:36.575091 | orchestrator | 2026-04-20 01:15:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:36.575163 | orchestrator | 2026-04-20 01:15:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:39.621941 | orchestrator | 2026-04-20 01:15:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:39.623506 | orchestrator | 2026-04-20 01:15:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:39.623555 | orchestrator | 2026-04-20 01:15:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:42.665671 | orchestrator | 2026-04-20 01:15:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 01:15:42.667825 | orchestrator | 2026-04-20 01:15:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:42.667892 | orchestrator | 2026-04-20 01:15:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:45.709203 | orchestrator | 2026-04-20 01:15:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:45.711492 | orchestrator | 2026-04-20 01:15:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:45.711583 | orchestrator | 2026-04-20 01:15:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:48.754958 | orchestrator | 2026-04-20 01:15:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:48.757365 | orchestrator | 2026-04-20 01:15:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:48.757459 | orchestrator | 2026-04-20 01:15:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:51.801080 | orchestrator | 2026-04-20 01:15:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:51.803247 | orchestrator | 2026-04-20 01:15:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:51.803352 | orchestrator | 2026-04-20 01:15:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:54.850109 | orchestrator | 2026-04-20 01:15:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:54.852163 | orchestrator | 2026-04-20 01:15:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:54.852223 | orchestrator | 2026-04-20 01:15:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:15:57.895636 | orchestrator | 2026-04-20 01:15:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:15:57.897803 | orchestrator | 2026-04-20 01:15:57 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:15:57.897883 | orchestrator | 2026-04-20 01:15:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:16:00.942399 | orchestrator | 2026-04-20 01:16:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:16:00.942572 | orchestrator | 2026-04-20 01:16:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:16:00.942882 | orchestrator | 2026-04-20 01:16:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:16:03.991572 | orchestrator | 2026-04-20 01:16:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:16:03.993536 | orchestrator | 2026-04-20 01:16:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:16:03.993597 | orchestrator | 2026-04-20 01:16:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:16:07.044981 | orchestrator | 2026-04-20 01:16:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:16:07.046606 | orchestrator | 2026-04-20 01:16:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:16:07.046676 | orchestrator | 2026-04-20 01:16:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:16:10.095751 | orchestrator | 2026-04-20 01:16:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:16:10.097702 | orchestrator | 2026-04-20 01:16:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:16:10.097810 | orchestrator | 2026-04-20 01:16:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:16:13.139334 | orchestrator | 2026-04-20 01:16:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:16:13.140360 | orchestrator | 2026-04-20 01:16:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
01:16:13.140407 | orchestrator | 2026-04-20 01:16:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:16:16.183297 | orchestrator | 2026-04-20 01:16:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:16:16.183361 | orchestrator | 2026-04-20 01:16:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:16:16.183370 | orchestrator | 2026-04-20 01:16:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:16:19.231934 | orchestrator | 2026-04-20 01:16:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:16:19.234233 | orchestrator | 2026-04-20 01:16:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:16:19.234493 | orchestrator | 2026-04-20 01:16:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:16:22.277498 | orchestrator | 2026-04-20 01:16:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:16:22.279162 | orchestrator | 2026-04-20 01:16:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:16:22.279273 | orchestrator | 2026-04-20 01:16:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:16:25.324760 | orchestrator | 2026-04-20 01:16:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:16:25.327523 | orchestrator | 2026-04-20 01:16:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:16:25.327630 | orchestrator | 2026-04-20 01:16:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:16:28.376340 | orchestrator | 2026-04-20 01:16:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:16:28.377770 | orchestrator | 2026-04-20 01:16:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:16:28.377806 | orchestrator | 2026-04-20 01:16:28 | INFO  | Wait 1 second(s) 
until the next check
2026-04-20 01:16:31.429979 | orchestrator | 2026-04-20 01:16:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 01:16:31.431939 | orchestrator | 2026-04-20 01:16:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 01:16:31.431987 | orchestrator | 2026-04-20 01:16:31 | INFO  | Wait 1 second(s) until the next check
[... identical status checks for both tasks repeated roughly every 3 seconds; no console output between 01:19:58 and 01:22:01 ...]
2026-04-20 01:23:45.406980 | orchestrator | 2026-04-20 01:23:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 01:23:45.408256 | orchestrator | 2026-04-20 01:23:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 01:23:45.408307 | orchestrator | 2026-04-20 01:23:45 | INFO  | Wait 1 second(s)
until the next check 2026-04-20 01:23:48.450353 | orchestrator | 2026-04-20 01:23:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:23:48.452315 | orchestrator | 2026-04-20 01:23:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:23:48.452409 | orchestrator | 2026-04-20 01:23:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:23:51.494228 | orchestrator | 2026-04-20 01:23:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:23:51.495544 | orchestrator | 2026-04-20 01:23:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:23:51.496110 | orchestrator | 2026-04-20 01:23:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:23:54.541091 | orchestrator | 2026-04-20 01:23:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:23:54.546643 | orchestrator | 2026-04-20 01:23:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:23:54.546722 | orchestrator | 2026-04-20 01:23:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:23:57.585063 | orchestrator | 2026-04-20 01:23:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:23:57.586956 | orchestrator | 2026-04-20 01:23:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:23:57.587237 | orchestrator | 2026-04-20 01:23:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:00.627999 | orchestrator | 2026-04-20 01:24:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:00.630634 | orchestrator | 2026-04-20 01:24:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:00.631361 | orchestrator | 2026-04-20 01:24:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:03.671338 | orchestrator | 2026-04-20 
01:24:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:03.674119 | orchestrator | 2026-04-20 01:24:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:03.674184 | orchestrator | 2026-04-20 01:24:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:06.724106 | orchestrator | 2026-04-20 01:24:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:06.726197 | orchestrator | 2026-04-20 01:24:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:06.726277 | orchestrator | 2026-04-20 01:24:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:09.770624 | orchestrator | 2026-04-20 01:24:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:09.772093 | orchestrator | 2026-04-20 01:24:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:09.772119 | orchestrator | 2026-04-20 01:24:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:12.813387 | orchestrator | 2026-04-20 01:24:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:12.815157 | orchestrator | 2026-04-20 01:24:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:12.815281 | orchestrator | 2026-04-20 01:24:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:15.858430 | orchestrator | 2026-04-20 01:24:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:15.861379 | orchestrator | 2026-04-20 01:24:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:15.861549 | orchestrator | 2026-04-20 01:24:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:18.905185 | orchestrator | 2026-04-20 01:24:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 01:24:18.908148 | orchestrator | 2026-04-20 01:24:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:18.908229 | orchestrator | 2026-04-20 01:24:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:21.953974 | orchestrator | 2026-04-20 01:24:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:21.956632 | orchestrator | 2026-04-20 01:24:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:21.956692 | orchestrator | 2026-04-20 01:24:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:24.992828 | orchestrator | 2026-04-20 01:24:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:24.994383 | orchestrator | 2026-04-20 01:24:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:24.994439 | orchestrator | 2026-04-20 01:24:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:28.042048 | orchestrator | 2026-04-20 01:24:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:28.044991 | orchestrator | 2026-04-20 01:24:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:28.045087 | orchestrator | 2026-04-20 01:24:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:31.086849 | orchestrator | 2026-04-20 01:24:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:31.088443 | orchestrator | 2026-04-20 01:24:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:31.088507 | orchestrator | 2026-04-20 01:24:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:34.127075 | orchestrator | 2026-04-20 01:24:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:34.129318 | orchestrator | 2026-04-20 01:24:34 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:34.129386 | orchestrator | 2026-04-20 01:24:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:37.169348 | orchestrator | 2026-04-20 01:24:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:37.171647 | orchestrator | 2026-04-20 01:24:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:37.171702 | orchestrator | 2026-04-20 01:24:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:40.214807 | orchestrator | 2026-04-20 01:24:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:40.217019 | orchestrator | 2026-04-20 01:24:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:40.217085 | orchestrator | 2026-04-20 01:24:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:43.256160 | orchestrator | 2026-04-20 01:24:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:43.258509 | orchestrator | 2026-04-20 01:24:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:43.258612 | orchestrator | 2026-04-20 01:24:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:46.297697 | orchestrator | 2026-04-20 01:24:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:46.299672 | orchestrator | 2026-04-20 01:24:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:46.299725 | orchestrator | 2026-04-20 01:24:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:49.342407 | orchestrator | 2026-04-20 01:24:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:49.344403 | orchestrator | 2026-04-20 01:24:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
01:24:49.344546 | orchestrator | 2026-04-20 01:24:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:52.384842 | orchestrator | 2026-04-20 01:24:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:52.386835 | orchestrator | 2026-04-20 01:24:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:52.386959 | orchestrator | 2026-04-20 01:24:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:55.433516 | orchestrator | 2026-04-20 01:24:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:55.435260 | orchestrator | 2026-04-20 01:24:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:55.435310 | orchestrator | 2026-04-20 01:24:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:24:58.479103 | orchestrator | 2026-04-20 01:24:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:24:58.481893 | orchestrator | 2026-04-20 01:24:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:24:58.482169 | orchestrator | 2026-04-20 01:24:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:01.524069 | orchestrator | 2026-04-20 01:25:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:01.526451 | orchestrator | 2026-04-20 01:25:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:01.526522 | orchestrator | 2026-04-20 01:25:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:04.570562 | orchestrator | 2026-04-20 01:25:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:04.572208 | orchestrator | 2026-04-20 01:25:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:04.572300 | orchestrator | 2026-04-20 01:25:04 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 01:25:07.618551 | orchestrator | 2026-04-20 01:25:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:07.619852 | orchestrator | 2026-04-20 01:25:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:07.620015 | orchestrator | 2026-04-20 01:25:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:10.663272 | orchestrator | 2026-04-20 01:25:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:10.665347 | orchestrator | 2026-04-20 01:25:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:10.665459 | orchestrator | 2026-04-20 01:25:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:13.708801 | orchestrator | 2026-04-20 01:25:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:13.711290 | orchestrator | 2026-04-20 01:25:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:13.711366 | orchestrator | 2026-04-20 01:25:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:16.750685 | orchestrator | 2026-04-20 01:25:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:16.750866 | orchestrator | 2026-04-20 01:25:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:16.750881 | orchestrator | 2026-04-20 01:25:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:19.788754 | orchestrator | 2026-04-20 01:25:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:19.791047 | orchestrator | 2026-04-20 01:25:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:19.791113 | orchestrator | 2026-04-20 01:25:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:22.835408 | orchestrator | 2026-04-20 
01:25:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:22.837673 | orchestrator | 2026-04-20 01:25:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:22.837747 | orchestrator | 2026-04-20 01:25:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:25.884199 | orchestrator | 2026-04-20 01:25:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:25.885638 | orchestrator | 2026-04-20 01:25:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:25.885785 | orchestrator | 2026-04-20 01:25:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:28.928649 | orchestrator | 2026-04-20 01:25:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:28.930914 | orchestrator | 2026-04-20 01:25:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:28.931011 | orchestrator | 2026-04-20 01:25:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:31.975328 | orchestrator | 2026-04-20 01:25:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:31.976526 | orchestrator | 2026-04-20 01:25:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:31.976624 | orchestrator | 2026-04-20 01:25:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:35.019818 | orchestrator | 2026-04-20 01:25:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:35.022501 | orchestrator | 2026-04-20 01:25:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:35.022577 | orchestrator | 2026-04-20 01:25:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:38.060864 | orchestrator | 2026-04-20 01:25:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 01:25:38.062288 | orchestrator | 2026-04-20 01:25:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:38.062330 | orchestrator | 2026-04-20 01:25:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:41.110084 | orchestrator | 2026-04-20 01:25:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:41.111638 | orchestrator | 2026-04-20 01:25:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:41.111677 | orchestrator | 2026-04-20 01:25:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:44.153720 | orchestrator | 2026-04-20 01:25:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:44.155189 | orchestrator | 2026-04-20 01:25:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:44.155244 | orchestrator | 2026-04-20 01:25:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:47.201521 | orchestrator | 2026-04-20 01:25:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:47.201869 | orchestrator | 2026-04-20 01:25:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:47.201986 | orchestrator | 2026-04-20 01:25:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:50.246525 | orchestrator | 2026-04-20 01:25:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:50.249086 | orchestrator | 2026-04-20 01:25:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:50.249765 | orchestrator | 2026-04-20 01:25:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:53.293674 | orchestrator | 2026-04-20 01:25:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:53.295304 | orchestrator | 2026-04-20 01:25:53 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:53.295369 | orchestrator | 2026-04-20 01:25:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:56.340200 | orchestrator | 2026-04-20 01:25:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:56.341924 | orchestrator | 2026-04-20 01:25:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:56.342320 | orchestrator | 2026-04-20 01:25:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:25:59.385657 | orchestrator | 2026-04-20 01:25:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:25:59.387208 | orchestrator | 2026-04-20 01:25:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:25:59.387256 | orchestrator | 2026-04-20 01:25:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:26:02.426905 | orchestrator | 2026-04-20 01:26:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:26:02.430999 | orchestrator | 2026-04-20 01:26:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:26:02.431069 | orchestrator | 2026-04-20 01:26:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:26:05.474818 | orchestrator | 2026-04-20 01:26:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:26:05.478288 | orchestrator | 2026-04-20 01:26:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:26:05.478403 | orchestrator | 2026-04-20 01:26:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:26:08.517881 | orchestrator | 2026-04-20 01:26:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:26:08.520271 | orchestrator | 2026-04-20 01:26:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
01:26:08.520544 | orchestrator | 2026-04-20 01:26:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:26:11.565549 | orchestrator | 2026-04-20 01:26:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:26:11.566754 | orchestrator | 2026-04-20 01:26:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:26:11.566786 | orchestrator | 2026-04-20 01:26:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:26:14.612232 | orchestrator | 2026-04-20 01:26:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:26:14.613995 | orchestrator | 2026-04-20 01:26:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:26:14.614299 | orchestrator | 2026-04-20 01:26:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:26:17.653209 | orchestrator | 2026-04-20 01:26:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:26:17.654953 | orchestrator | 2026-04-20 01:26:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:26:17.655043 | orchestrator | 2026-04-20 01:26:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:26:20.693578 | orchestrator | 2026-04-20 01:26:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:26:20.694919 | orchestrator | 2026-04-20 01:26:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:26:20.694997 | orchestrator | 2026-04-20 01:26:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:26:23.735379 | orchestrator | 2026-04-20 01:26:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:26:23.736686 | orchestrator | 2026-04-20 01:26:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:26:23.736729 | orchestrator | 2026-04-20 01:26:23 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 01:26:26.776874 | orchestrator | 2026-04-20 01:26:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:26:26.778208 | orchestrator | 2026-04-20 01:26:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:26:26.778335 | orchestrator | 2026-04-20 01:26:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:26:29.817960 | orchestrator | 2026-04-20 01:26:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:26:29.819592 | orchestrator | 2026-04-20 01:26:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:26:29.819677 | orchestrator | 2026-04-20 01:26:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:26:32.866724 | orchestrator | 2026-04-20 01:26:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:26:32.869859 | orchestrator | 2026-04-20 01:26:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:26:32.870225 | orchestrator | 2026-04-20 01:26:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:26:35.917055 | orchestrator | 2026-04-20 01:26:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:26:35.919162 | orchestrator | 2026-04-20 01:26:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:26:35.919207 | orchestrator | 2026-04-20 01:26:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:26:38.963629 | orchestrator | 2026-04-20 01:26:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:26:38.967443 | orchestrator | 2026-04-20 01:26:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:26:38.967507 | orchestrator | 2026-04-20 01:26:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:26:42.012823 | orchestrator | 2026-04-20 
01:26:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:26:42.015708 | orchestrator | 2026-04-20 01:26:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:26:42.015787 | orchestrator | 2026-04-20 01:26:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:26:45.060843 | orchestrator | 2026-04-20 01:26:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:26:45.062205 | orchestrator | 2026-04-20 01:26:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:26:45.062258 | orchestrator | 2026-04-20 01:26:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:26:48.105309 | orchestrator | 2026-04-20 01:26:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:26:48.106889 | orchestrator | 2026-04-20 01:26:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:26:48.106977 | orchestrator | 2026-04-20 01:26:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:26:51.140891 | orchestrator | 2026-04-20 01:26:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:26:51.142770 | orchestrator | 2026-04-20 01:26:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:26:51.142841 | orchestrator | 2026-04-20 01:26:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:26:54.186656 | orchestrator | 2026-04-20 01:26:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:26:54.188763 | orchestrator | 2026-04-20 01:26:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:26:54.189037 | orchestrator | 2026-04-20 01:26:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:26:57.235414 | orchestrator | 2026-04-20 01:26:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 01:26:57.236304 | orchestrator | 2026-04-20 01:26:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:26:57.236370 | orchestrator | 2026-04-20 01:26:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:27:00.279653 | orchestrator | 2026-04-20 01:27:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:27:00.279774 | orchestrator | 2026-04-20 01:27:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:27:00.279784 | orchestrator | 2026-04-20 01:27:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:27:03.323369 | orchestrator | 2026-04-20 01:27:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:27:03.324214 | orchestrator | 2026-04-20 01:27:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:27:03.324252 | orchestrator | 2026-04-20 01:27:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:27:06.367800 | orchestrator | 2026-04-20 01:27:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:27:06.369977 | orchestrator | 2026-04-20 01:27:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:27:06.370134 | orchestrator | 2026-04-20 01:27:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:27:09.410510 | orchestrator | 2026-04-20 01:27:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:27:09.412617 | orchestrator | 2026-04-20 01:27:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:27:09.412703 | orchestrator | 2026-04-20 01:27:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:27:12.456886 | orchestrator | 2026-04-20 01:27:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:27:12.459131 | orchestrator | 2026-04-20 01:27:12 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:27:12.459269 | orchestrator | 2026-04-20 01:27:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:27:15.497607 | orchestrator | 2026-04-20 01:27:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:27:15.498724 | orchestrator | 2026-04-20 01:27:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:27:15.498837 | orchestrator | 2026-04-20 01:27:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:27:18.541439 | orchestrator | 2026-04-20 01:27:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:27:18.543336 | orchestrator | 2026-04-20 01:27:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:27:18.543393 | orchestrator | 2026-04-20 01:27:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:27:21.585311 | orchestrator | 2026-04-20 01:27:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:27:21.587020 | orchestrator | 2026-04-20 01:27:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:27:21.587068 | orchestrator | 2026-04-20 01:27:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:27:24.636274 | orchestrator | 2026-04-20 01:27:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:27:24.636463 | orchestrator | 2026-04-20 01:27:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:27:24.636474 | orchestrator | 2026-04-20 01:27:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:27:27.684168 | orchestrator | 2026-04-20 01:27:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:27:27.687121 | orchestrator | 2026-04-20 01:27:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
01:27:27.687181 | orchestrator | 2026-04-20 01:27:27 | INFO  | Wait 1 second(s) until the next check
2026-04-20 01:27:30.733742 | orchestrator | 2026-04-20 01:27:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 01:27:30.736263 | orchestrator | 2026-04-20 01:27:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 01:27:30.736316 | orchestrator | 2026-04-20 01:27:30 | INFO  | Wait 1 second(s) until the next check
[... identical status checks repeated every ~3 seconds from 01:27:33 through 01:32:56; both tasks remained in state STARTED throughout ...]
2026-04-20 01:32:59.875347 | orchestrator | 2026-04-20 01:32:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 01:32:59.876934 | orchestrator | 2026-04-20 01:32:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 01:32:59.877009 | orchestrator | 2026-04-20 01:32:59 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 01:33:02.920387 | orchestrator | 2026-04-20 01:33:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:33:02.921708 | orchestrator | 2026-04-20 01:33:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:02.921756 | orchestrator | 2026-04-20 01:33:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:33:05.966780 | orchestrator | 2026-04-20 01:33:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:33:05.967771 | orchestrator | 2026-04-20 01:33:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:05.967814 | orchestrator | 2026-04-20 01:33:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:33:09.011311 | orchestrator | 2026-04-20 01:33:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:33:09.012966 | orchestrator | 2026-04-20 01:33:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:09.013082 | orchestrator | 2026-04-20 01:33:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:33:12.054491 | orchestrator | 2026-04-20 01:33:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:33:12.055218 | orchestrator | 2026-04-20 01:33:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:12.055268 | orchestrator | 2026-04-20 01:33:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:33:15.102822 | orchestrator | 2026-04-20 01:33:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:33:15.104842 | orchestrator | 2026-04-20 01:33:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:15.104880 | orchestrator | 2026-04-20 01:33:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:33:18.150943 | orchestrator | 2026-04-20 
01:33:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:33:18.151065 | orchestrator | 2026-04-20 01:33:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:18.151075 | orchestrator | 2026-04-20 01:33:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:33:21.195136 | orchestrator | 2026-04-20 01:33:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:33:21.196002 | orchestrator | 2026-04-20 01:33:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:21.196093 | orchestrator | 2026-04-20 01:33:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:33:24.237616 | orchestrator | 2026-04-20 01:33:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:33:24.239648 | orchestrator | 2026-04-20 01:33:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:24.239701 | orchestrator | 2026-04-20 01:33:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:33:27.287700 | orchestrator | 2026-04-20 01:33:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:33:27.289996 | orchestrator | 2026-04-20 01:33:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:27.290129 | orchestrator | 2026-04-20 01:33:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:33:30.333932 | orchestrator | 2026-04-20 01:33:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:33:30.336611 | orchestrator | 2026-04-20 01:33:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:30.336680 | orchestrator | 2026-04-20 01:33:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:33:33.380817 | orchestrator | 2026-04-20 01:33:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 01:33:33.383066 | orchestrator | 2026-04-20 01:33:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:33.383139 | orchestrator | 2026-04-20 01:33:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:33:36.434797 | orchestrator | 2026-04-20 01:33:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:33:36.436835 | orchestrator | 2026-04-20 01:33:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:36.436889 | orchestrator | 2026-04-20 01:33:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:33:39.481464 | orchestrator | 2026-04-20 01:33:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:33:39.482569 | orchestrator | 2026-04-20 01:33:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:39.482615 | orchestrator | 2026-04-20 01:33:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:33:42.527794 | orchestrator | 2026-04-20 01:33:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:33:42.530203 | orchestrator | 2026-04-20 01:33:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:42.530271 | orchestrator | 2026-04-20 01:33:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:33:45.571985 | orchestrator | 2026-04-20 01:33:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:33:45.574060 | orchestrator | 2026-04-20 01:33:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:45.574105 | orchestrator | 2026-04-20 01:33:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:33:48.619106 | orchestrator | 2026-04-20 01:33:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:33:48.622786 | orchestrator | 2026-04-20 01:33:48 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:48.622885 | orchestrator | 2026-04-20 01:33:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:33:51.674504 | orchestrator | 2026-04-20 01:33:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:33:51.676683 | orchestrator | 2026-04-20 01:33:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:51.676747 | orchestrator | 2026-04-20 01:33:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:33:54.722242 | orchestrator | 2026-04-20 01:33:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:33:54.723387 | orchestrator | 2026-04-20 01:33:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:54.723422 | orchestrator | 2026-04-20 01:33:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:33:57.768286 | orchestrator | 2026-04-20 01:33:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:33:57.771884 | orchestrator | 2026-04-20 01:33:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:33:57.771963 | orchestrator | 2026-04-20 01:33:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:00.815782 | orchestrator | 2026-04-20 01:34:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:00.816998 | orchestrator | 2026-04-20 01:34:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:00.817062 | orchestrator | 2026-04-20 01:34:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:03.861781 | orchestrator | 2026-04-20 01:34:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:03.864029 | orchestrator | 2026-04-20 01:34:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
01:34:03.864050 | orchestrator | 2026-04-20 01:34:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:06.907410 | orchestrator | 2026-04-20 01:34:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:06.908624 | orchestrator | 2026-04-20 01:34:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:06.908657 | orchestrator | 2026-04-20 01:34:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:09.953571 | orchestrator | 2026-04-20 01:34:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:09.955056 | orchestrator | 2026-04-20 01:34:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:09.955122 | orchestrator | 2026-04-20 01:34:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:13.004418 | orchestrator | 2026-04-20 01:34:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:13.006541 | orchestrator | 2026-04-20 01:34:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:13.006691 | orchestrator | 2026-04-20 01:34:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:16.051026 | orchestrator | 2026-04-20 01:34:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:16.053121 | orchestrator | 2026-04-20 01:34:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:16.053304 | orchestrator | 2026-04-20 01:34:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:19.097891 | orchestrator | 2026-04-20 01:34:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:19.097963 | orchestrator | 2026-04-20 01:34:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:19.097987 | orchestrator | 2026-04-20 01:34:19 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 01:34:22.146653 | orchestrator | 2026-04-20 01:34:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:22.148787 | orchestrator | 2026-04-20 01:34:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:22.148864 | orchestrator | 2026-04-20 01:34:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:25.190774 | orchestrator | 2026-04-20 01:34:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:25.193536 | orchestrator | 2026-04-20 01:34:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:25.193597 | orchestrator | 2026-04-20 01:34:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:28.237602 | orchestrator | 2026-04-20 01:34:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:28.240092 | orchestrator | 2026-04-20 01:34:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:28.240206 | orchestrator | 2026-04-20 01:34:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:31.287746 | orchestrator | 2026-04-20 01:34:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:31.289842 | orchestrator | 2026-04-20 01:34:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:31.289935 | orchestrator | 2026-04-20 01:34:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:34.336031 | orchestrator | 2026-04-20 01:34:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:34.337906 | orchestrator | 2026-04-20 01:34:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:34.337983 | orchestrator | 2026-04-20 01:34:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:37.388069 | orchestrator | 2026-04-20 
01:34:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:37.389823 | orchestrator | 2026-04-20 01:34:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:37.389996 | orchestrator | 2026-04-20 01:34:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:40.437275 | orchestrator | 2026-04-20 01:34:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:40.439710 | orchestrator | 2026-04-20 01:34:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:40.439770 | orchestrator | 2026-04-20 01:34:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:43.481101 | orchestrator | 2026-04-20 01:34:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:43.481856 | orchestrator | 2026-04-20 01:34:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:43.481878 | orchestrator | 2026-04-20 01:34:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:46.534945 | orchestrator | 2026-04-20 01:34:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:46.536389 | orchestrator | 2026-04-20 01:34:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:46.536498 | orchestrator | 2026-04-20 01:34:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:49.582076 | orchestrator | 2026-04-20 01:34:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:49.582780 | orchestrator | 2026-04-20 01:34:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:49.582820 | orchestrator | 2026-04-20 01:34:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:52.625394 | orchestrator | 2026-04-20 01:34:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 01:34:52.626824 | orchestrator | 2026-04-20 01:34:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:52.626876 | orchestrator | 2026-04-20 01:34:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:55.665916 | orchestrator | 2026-04-20 01:34:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:55.667801 | orchestrator | 2026-04-20 01:34:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:55.667872 | orchestrator | 2026-04-20 01:34:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:34:58.717623 | orchestrator | 2026-04-20 01:34:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:34:58.720115 | orchestrator | 2026-04-20 01:34:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:34:58.720230 | orchestrator | 2026-04-20 01:34:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:01.767427 | orchestrator | 2026-04-20 01:35:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:01.769973 | orchestrator | 2026-04-20 01:35:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:01.770083 | orchestrator | 2026-04-20 01:35:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:04.821431 | orchestrator | 2026-04-20 01:35:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:04.823647 | orchestrator | 2026-04-20 01:35:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:04.823736 | orchestrator | 2026-04-20 01:35:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:07.878958 | orchestrator | 2026-04-20 01:35:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:07.880588 | orchestrator | 2026-04-20 01:35:07 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:07.880658 | orchestrator | 2026-04-20 01:35:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:10.924826 | orchestrator | 2026-04-20 01:35:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:10.929196 | orchestrator | 2026-04-20 01:35:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:10.929259 | orchestrator | 2026-04-20 01:35:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:13.981475 | orchestrator | 2026-04-20 01:35:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:13.983288 | orchestrator | 2026-04-20 01:35:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:13.983341 | orchestrator | 2026-04-20 01:35:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:17.034783 | orchestrator | 2026-04-20 01:35:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:17.036427 | orchestrator | 2026-04-20 01:35:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:17.036538 | orchestrator | 2026-04-20 01:35:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:20.087632 | orchestrator | 2026-04-20 01:35:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:20.089358 | orchestrator | 2026-04-20 01:35:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:20.089442 | orchestrator | 2026-04-20 01:35:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:23.145328 | orchestrator | 2026-04-20 01:35:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:23.150487 | orchestrator | 2026-04-20 01:35:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
01:35:23.150556 | orchestrator | 2026-04-20 01:35:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:26.192777 | orchestrator | 2026-04-20 01:35:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:26.194698 | orchestrator | 2026-04-20 01:35:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:26.194735 | orchestrator | 2026-04-20 01:35:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:29.243048 | orchestrator | 2026-04-20 01:35:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:29.245501 | orchestrator | 2026-04-20 01:35:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:29.246136 | orchestrator | 2026-04-20 01:35:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:32.292217 | orchestrator | 2026-04-20 01:35:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:32.292763 | orchestrator | 2026-04-20 01:35:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:32.292799 | orchestrator | 2026-04-20 01:35:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:35.335969 | orchestrator | 2026-04-20 01:35:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:35.337653 | orchestrator | 2026-04-20 01:35:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:35.337768 | orchestrator | 2026-04-20 01:35:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:38.386454 | orchestrator | 2026-04-20 01:35:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:38.388730 | orchestrator | 2026-04-20 01:35:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:38.388792 | orchestrator | 2026-04-20 01:35:38 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 01:35:41.433335 | orchestrator | 2026-04-20 01:35:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:41.434845 | orchestrator | 2026-04-20 01:35:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:41.434921 | orchestrator | 2026-04-20 01:35:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:44.480479 | orchestrator | 2026-04-20 01:35:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:44.483085 | orchestrator | 2026-04-20 01:35:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:44.483192 | orchestrator | 2026-04-20 01:35:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:47.533962 | orchestrator | 2026-04-20 01:35:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:47.536226 | orchestrator | 2026-04-20 01:35:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:47.536314 | orchestrator | 2026-04-20 01:35:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:50.578523 | orchestrator | 2026-04-20 01:35:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:50.580195 | orchestrator | 2026-04-20 01:35:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:50.580245 | orchestrator | 2026-04-20 01:35:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:53.632998 | orchestrator | 2026-04-20 01:35:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:53.634759 | orchestrator | 2026-04-20 01:35:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:53.634828 | orchestrator | 2026-04-20 01:35:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:56.682219 | orchestrator | 2026-04-20 
01:35:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:56.684259 | orchestrator | 2026-04-20 01:35:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:56.684321 | orchestrator | 2026-04-20 01:35:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:35:59.728070 | orchestrator | 2026-04-20 01:35:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:35:59.729721 | orchestrator | 2026-04-20 01:35:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:35:59.729782 | orchestrator | 2026-04-20 01:35:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:36:02.775575 | orchestrator | 2026-04-20 01:36:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:36:02.777033 | orchestrator | 2026-04-20 01:36:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:36:02.777401 | orchestrator | 2026-04-20 01:36:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:36:05.824459 | orchestrator | 2026-04-20 01:36:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:36:05.827284 | orchestrator | 2026-04-20 01:36:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:36:05.827377 | orchestrator | 2026-04-20 01:36:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:36:08.872583 | orchestrator | 2026-04-20 01:36:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:36:08.875330 | orchestrator | 2026-04-20 01:36:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:36:08.875448 | orchestrator | 2026-04-20 01:36:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:36:11.923660 | orchestrator | 2026-04-20 01:36:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 01:36:11.925760 | orchestrator | 2026-04-20 01:36:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:36:11.925823 | orchestrator | 2026-04-20 01:36:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:36:14.972064 | orchestrator | 2026-04-20 01:36:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:36:14.974716 | orchestrator | 2026-04-20 01:36:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:36:14.974832 | orchestrator | 2026-04-20 01:36:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:36:18.022065 | orchestrator | 2026-04-20 01:36:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:36:18.023560 | orchestrator | 2026-04-20 01:36:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:36:18.023633 | orchestrator | 2026-04-20 01:36:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:36:21.064356 | orchestrator | 2026-04-20 01:36:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:36:21.065886 | orchestrator | 2026-04-20 01:36:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:36:21.065923 | orchestrator | 2026-04-20 01:36:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:36:24.110937 | orchestrator | 2026-04-20 01:36:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:36:24.112199 | orchestrator | 2026-04-20 01:36:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:36:24.112305 | orchestrator | 2026-04-20 01:36:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:36:27.158826 | orchestrator | 2026-04-20 01:36:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:36:27.160060 | orchestrator | 2026-04-20 01:36:27 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:36:27.160158 | orchestrator | 2026-04-20 01:36:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:36:30.208787 | orchestrator | 2026-04-20 01:36:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:36:30.210399 | orchestrator | 2026-04-20 01:36:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:36:30.210485 | orchestrator | 2026-04-20 01:36:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:36:33.255487 | orchestrator | 2026-04-20 01:36:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:36:33.257010 | orchestrator | 2026-04-20 01:36:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:36:33.257165 | orchestrator | 2026-04-20 01:36:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:36:36.299918 | orchestrator | 2026-04-20 01:36:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:36:36.302420 | orchestrator | 2026-04-20 01:36:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:36:36.302510 | orchestrator | 2026-04-20 01:36:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:36:39.345298 | orchestrator | 2026-04-20 01:36:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:36:39.348295 | orchestrator | 2026-04-20 01:36:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:36:39.348374 | orchestrator | 2026-04-20 01:36:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:36:42.391890 | orchestrator | 2026-04-20 01:36:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:36:42.393964 | orchestrator | 2026-04-20 01:36:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
01:36:42.394132 | orchestrator | 2026-04-20 01:36:42 | INFO  | Wait 1 second(s) until the next check
2026-04-20 01:36:45.438068 | orchestrator | 2026-04-20 01:36:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 01:36:45.440847 | orchestrator | 2026-04-20 01:36:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 01:36:45.440907 | orchestrator | 2026-04-20 01:36:45 | INFO  | Wait 1 second(s) until the next check
2026-04-20 01:41:41.161956 | orchestrator | 2026-04-20 01:41:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 01:41:41.164032 | orchestrator | 2026-04-20 01:41:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 01:41:41.164106 | orchestrator | 2026-04-20 01:41:41 | INFO  | Wait 1 second(s) until the next check
2026-04-20 01:41:44.210992 | orchestrator | 2026-04-20 01:41:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 01:41:44.212192 | orchestrator | 2026-04-20 01:41:44 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:41:44.212270 | orchestrator | 2026-04-20 01:41:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:41:47.263666 | orchestrator | 2026-04-20 01:41:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:41:47.265615 | orchestrator | 2026-04-20 01:41:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:41:47.265636 | orchestrator | 2026-04-20 01:41:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:41:50.301105 | orchestrator | 2026-04-20 01:41:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:41:50.302008 | orchestrator | 2026-04-20 01:41:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:41:50.302110 | orchestrator | 2026-04-20 01:41:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:41:53.337815 | orchestrator | 2026-04-20 01:41:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:41:53.339129 | orchestrator | 2026-04-20 01:41:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:41:53.339228 | orchestrator | 2026-04-20 01:41:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:41:56.389929 | orchestrator | 2026-04-20 01:41:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:41:56.391443 | orchestrator | 2026-04-20 01:41:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:41:56.391497 | orchestrator | 2026-04-20 01:41:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:41:59.435145 | orchestrator | 2026-04-20 01:41:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:41:59.436779 | orchestrator | 2026-04-20 01:41:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
01:41:59.436813 | orchestrator | 2026-04-20 01:41:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:42:02.479292 | orchestrator | 2026-04-20 01:42:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:42:02.481130 | orchestrator | 2026-04-20 01:42:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:02.481239 | orchestrator | 2026-04-20 01:42:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:42:05.523527 | orchestrator | 2026-04-20 01:42:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:42:05.526205 | orchestrator | 2026-04-20 01:42:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:05.526265 | orchestrator | 2026-04-20 01:42:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:42:08.570152 | orchestrator | 2026-04-20 01:42:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:42:08.571831 | orchestrator | 2026-04-20 01:42:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:08.571910 | orchestrator | 2026-04-20 01:42:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:42:11.614813 | orchestrator | 2026-04-20 01:42:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:42:11.615740 | orchestrator | 2026-04-20 01:42:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:11.615858 | orchestrator | 2026-04-20 01:42:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:42:14.660735 | orchestrator | 2026-04-20 01:42:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:42:14.662143 | orchestrator | 2026-04-20 01:42:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:14.662273 | orchestrator | 2026-04-20 01:42:14 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 01:42:17.711556 | orchestrator | 2026-04-20 01:42:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:42:17.714453 | orchestrator | 2026-04-20 01:42:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:17.714519 | orchestrator | 2026-04-20 01:42:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:42:20.765704 | orchestrator | 2026-04-20 01:42:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:42:20.771526 | orchestrator | 2026-04-20 01:42:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:20.771625 | orchestrator | 2026-04-20 01:42:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:42:23.819918 | orchestrator | 2026-04-20 01:42:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:42:23.821393 | orchestrator | 2026-04-20 01:42:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:23.821448 | orchestrator | 2026-04-20 01:42:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:42:26.872941 | orchestrator | 2026-04-20 01:42:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:42:26.875555 | orchestrator | 2026-04-20 01:42:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:26.875657 | orchestrator | 2026-04-20 01:42:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:42:29.924752 | orchestrator | 2026-04-20 01:42:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:42:29.926357 | orchestrator | 2026-04-20 01:42:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:29.926390 | orchestrator | 2026-04-20 01:42:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:42:32.965287 | orchestrator | 2026-04-20 
01:42:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:42:32.965462 | orchestrator | 2026-04-20 01:42:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:32.965485 | orchestrator | 2026-04-20 01:42:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:42:36.011943 | orchestrator | 2026-04-20 01:42:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:42:36.014368 | orchestrator | 2026-04-20 01:42:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:36.014423 | orchestrator | 2026-04-20 01:42:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:42:39.062246 | orchestrator | 2026-04-20 01:42:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:42:39.064040 | orchestrator | 2026-04-20 01:42:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:39.064253 | orchestrator | 2026-04-20 01:42:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:42:42.106305 | orchestrator | 2026-04-20 01:42:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:42:42.108742 | orchestrator | 2026-04-20 01:42:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:42.108804 | orchestrator | 2026-04-20 01:42:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:42:45.155644 | orchestrator | 2026-04-20 01:42:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:42:45.157452 | orchestrator | 2026-04-20 01:42:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:45.157504 | orchestrator | 2026-04-20 01:42:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:42:48.207228 | orchestrator | 2026-04-20 01:42:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 01:42:48.208998 | orchestrator | 2026-04-20 01:42:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:48.209054 | orchestrator | 2026-04-20 01:42:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:42:51.263236 | orchestrator | 2026-04-20 01:42:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:42:51.264726 | orchestrator | 2026-04-20 01:42:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:51.264868 | orchestrator | 2026-04-20 01:42:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:42:54.315426 | orchestrator | 2026-04-20 01:42:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:42:54.318257 | orchestrator | 2026-04-20 01:42:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:54.318308 | orchestrator | 2026-04-20 01:42:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:42:57.366581 | orchestrator | 2026-04-20 01:42:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:42:57.369660 | orchestrator | 2026-04-20 01:42:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:42:57.369714 | orchestrator | 2026-04-20 01:42:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:00.414531 | orchestrator | 2026-04-20 01:43:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:00.416545 | orchestrator | 2026-04-20 01:43:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:00.416615 | orchestrator | 2026-04-20 01:43:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:03.470965 | orchestrator | 2026-04-20 01:43:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:03.471712 | orchestrator | 2026-04-20 01:43:03 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:03.471807 | orchestrator | 2026-04-20 01:43:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:06.522225 | orchestrator | 2026-04-20 01:43:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:06.523328 | orchestrator | 2026-04-20 01:43:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:06.523386 | orchestrator | 2026-04-20 01:43:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:09.567211 | orchestrator | 2026-04-20 01:43:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:09.567923 | orchestrator | 2026-04-20 01:43:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:09.567955 | orchestrator | 2026-04-20 01:43:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:12.611687 | orchestrator | 2026-04-20 01:43:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:12.612690 | orchestrator | 2026-04-20 01:43:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:12.612897 | orchestrator | 2026-04-20 01:43:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:15.660548 | orchestrator | 2026-04-20 01:43:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:15.661637 | orchestrator | 2026-04-20 01:43:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:15.661678 | orchestrator | 2026-04-20 01:43:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:18.710128 | orchestrator | 2026-04-20 01:43:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:18.713120 | orchestrator | 2026-04-20 01:43:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
01:43:18.713342 | orchestrator | 2026-04-20 01:43:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:21.759142 | orchestrator | 2026-04-20 01:43:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:21.760304 | orchestrator | 2026-04-20 01:43:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:21.760361 | orchestrator | 2026-04-20 01:43:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:24.808101 | orchestrator | 2026-04-20 01:43:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:24.810415 | orchestrator | 2026-04-20 01:43:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:24.811025 | orchestrator | 2026-04-20 01:43:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:27.854442 | orchestrator | 2026-04-20 01:43:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:27.855987 | orchestrator | 2026-04-20 01:43:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:27.856036 | orchestrator | 2026-04-20 01:43:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:30.900857 | orchestrator | 2026-04-20 01:43:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:30.902586 | orchestrator | 2026-04-20 01:43:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:30.902671 | orchestrator | 2026-04-20 01:43:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:33.955267 | orchestrator | 2026-04-20 01:43:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:33.956449 | orchestrator | 2026-04-20 01:43:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:33.956500 | orchestrator | 2026-04-20 01:43:33 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 01:43:37.002589 | orchestrator | 2026-04-20 01:43:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:37.004293 | orchestrator | 2026-04-20 01:43:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:37.004339 | orchestrator | 2026-04-20 01:43:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:40.051497 | orchestrator | 2026-04-20 01:43:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:40.056428 | orchestrator | 2026-04-20 01:43:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:40.056545 | orchestrator | 2026-04-20 01:43:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:43.101911 | orchestrator | 2026-04-20 01:43:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:43.104040 | orchestrator | 2026-04-20 01:43:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:43.104089 | orchestrator | 2026-04-20 01:43:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:46.150906 | orchestrator | 2026-04-20 01:43:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:46.152387 | orchestrator | 2026-04-20 01:43:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:46.152478 | orchestrator | 2026-04-20 01:43:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:49.202355 | orchestrator | 2026-04-20 01:43:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:49.205493 | orchestrator | 2026-04-20 01:43:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:49.205553 | orchestrator | 2026-04-20 01:43:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:52.247057 | orchestrator | 2026-04-20 
01:43:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:52.249588 | orchestrator | 2026-04-20 01:43:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:52.249867 | orchestrator | 2026-04-20 01:43:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:55.296439 | orchestrator | 2026-04-20 01:43:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:55.299645 | orchestrator | 2026-04-20 01:43:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:55.299951 | orchestrator | 2026-04-20 01:43:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:43:58.352251 | orchestrator | 2026-04-20 01:43:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:43:58.354479 | orchestrator | 2026-04-20 01:43:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:43:58.354530 | orchestrator | 2026-04-20 01:43:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:01.402430 | orchestrator | 2026-04-20 01:44:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:01.404071 | orchestrator | 2026-04-20 01:44:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:01.404331 | orchestrator | 2026-04-20 01:44:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:04.451495 | orchestrator | 2026-04-20 01:44:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:04.453464 | orchestrator | 2026-04-20 01:44:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:04.453528 | orchestrator | 2026-04-20 01:44:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:07.495682 | orchestrator | 2026-04-20 01:44:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 01:44:07.497036 | orchestrator | 2026-04-20 01:44:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:07.497110 | orchestrator | 2026-04-20 01:44:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:10.545346 | orchestrator | 2026-04-20 01:44:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:10.547190 | orchestrator | 2026-04-20 01:44:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:10.547244 | orchestrator | 2026-04-20 01:44:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:13.594770 | orchestrator | 2026-04-20 01:44:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:13.596358 | orchestrator | 2026-04-20 01:44:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:13.596448 | orchestrator | 2026-04-20 01:44:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:16.644755 | orchestrator | 2026-04-20 01:44:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:16.648044 | orchestrator | 2026-04-20 01:44:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:16.648123 | orchestrator | 2026-04-20 01:44:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:19.697660 | orchestrator | 2026-04-20 01:44:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:19.699505 | orchestrator | 2026-04-20 01:44:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:19.699779 | orchestrator | 2026-04-20 01:44:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:22.742768 | orchestrator | 2026-04-20 01:44:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:22.743830 | orchestrator | 2026-04-20 01:44:22 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:22.743866 | orchestrator | 2026-04-20 01:44:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:25.794899 | orchestrator | 2026-04-20 01:44:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:25.797107 | orchestrator | 2026-04-20 01:44:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:25.797195 | orchestrator | 2026-04-20 01:44:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:28.841124 | orchestrator | 2026-04-20 01:44:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:28.842449 | orchestrator | 2026-04-20 01:44:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:28.842498 | orchestrator | 2026-04-20 01:44:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:31.886705 | orchestrator | 2026-04-20 01:44:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:31.888915 | orchestrator | 2026-04-20 01:44:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:31.889088 | orchestrator | 2026-04-20 01:44:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:34.940797 | orchestrator | 2026-04-20 01:44:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:34.942285 | orchestrator | 2026-04-20 01:44:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:34.942344 | orchestrator | 2026-04-20 01:44:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:37.991477 | orchestrator | 2026-04-20 01:44:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:37.993771 | orchestrator | 2026-04-20 01:44:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
01:44:37.993843 | orchestrator | 2026-04-20 01:44:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:41.040499 | orchestrator | 2026-04-20 01:44:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:41.042422 | orchestrator | 2026-04-20 01:44:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:41.042477 | orchestrator | 2026-04-20 01:44:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:44.087667 | orchestrator | 2026-04-20 01:44:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:44.089761 | orchestrator | 2026-04-20 01:44:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:44.089857 | orchestrator | 2026-04-20 01:44:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:47.134656 | orchestrator | 2026-04-20 01:44:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:47.136485 | orchestrator | 2026-04-20 01:44:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:47.136531 | orchestrator | 2026-04-20 01:44:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:50.186059 | orchestrator | 2026-04-20 01:44:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:50.188100 | orchestrator | 2026-04-20 01:44:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:50.188140 | orchestrator | 2026-04-20 01:44:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:53.241045 | orchestrator | 2026-04-20 01:44:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:53.243795 | orchestrator | 2026-04-20 01:44:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:53.243840 | orchestrator | 2026-04-20 01:44:53 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 01:44:56.290779 | orchestrator | 2026-04-20 01:44:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:56.293573 | orchestrator | 2026-04-20 01:44:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:56.293628 | orchestrator | 2026-04-20 01:44:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:44:59.344740 | orchestrator | 2026-04-20 01:44:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:44:59.347487 | orchestrator | 2026-04-20 01:44:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:44:59.347560 | orchestrator | 2026-04-20 01:44:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:45:02.392648 | orchestrator | 2026-04-20 01:45:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:45:02.394689 | orchestrator | 2026-04-20 01:45:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:45:02.394934 | orchestrator | 2026-04-20 01:45:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:45:05.441994 | orchestrator | 2026-04-20 01:45:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:45:05.443434 | orchestrator | 2026-04-20 01:45:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:45:05.444345 | orchestrator | 2026-04-20 01:45:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:45:08.490667 | orchestrator | 2026-04-20 01:45:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:45:08.493912 | orchestrator | 2026-04-20 01:45:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:45:08.494007 | orchestrator | 2026-04-20 01:45:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:45:11.540273 | orchestrator | 2026-04-20 
01:45:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:45:11.541757 | orchestrator | 2026-04-20 01:45:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:45:11.541787 | orchestrator | 2026-04-20 01:45:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:45:14.594075 | orchestrator | 2026-04-20 01:45:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:45:14.595687 | orchestrator | 2026-04-20 01:45:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:45:14.595963 | orchestrator | 2026-04-20 01:45:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:45:17.641639 | orchestrator | 2026-04-20 01:45:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:45:17.644995 | orchestrator | 2026-04-20 01:45:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:45:17.645071 | orchestrator | 2026-04-20 01:45:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:45:20.700800 | orchestrator | 2026-04-20 01:45:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:45:20.703471 | orchestrator | 2026-04-20 01:45:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:45:20.703545 | orchestrator | 2026-04-20 01:45:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:45:23.745023 | orchestrator | 2026-04-20 01:45:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:45:23.747122 | orchestrator | 2026-04-20 01:45:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:45:23.747280 | orchestrator | 2026-04-20 01:45:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:45:26.798100 | orchestrator | 2026-04-20 01:45:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 01:45:26.800516 | orchestrator | 2026-04-20 01:45:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:45:26.800668 | orchestrator | 2026-04-20 01:45:26 | INFO  | Wait 1 second(s) until the next check [... identical polling output for tasks 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e and 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c, repeated every ~3 seconds through 01:50:56, elided ...] 2026-04-20 01:50:59.079401 | orchestrator | 2026-04-20 01:50:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:50:59.080431 | orchestrator | 2026-04-20 01:50:59 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:50:59.080473 | orchestrator | 2026-04-20 01:50:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:51:02.125873 | orchestrator | 2026-04-20 01:51:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:02.241674 | orchestrator | 2026-04-20 01:53:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:53:02.241795 | orchestrator | 2026-04-20 01:53:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:53:05.285470 | orchestrator | 2026-04-20 01:53:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:05.287065 | orchestrator | 2026-04-20 01:53:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:53:05.287145 | orchestrator | 2026-04-20 01:53:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:53:08.331556 | orchestrator | 2026-04-20 01:53:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:08.333321 | orchestrator | 2026-04-20 01:53:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:53:08.333404 | orchestrator | 2026-04-20 01:53:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:53:11.374408 | orchestrator | 2026-04-20 01:53:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:11.375864 | orchestrator | 2026-04-20 01:53:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:53:11.375929 | orchestrator | 2026-04-20 01:53:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:53:14.423893 | orchestrator | 2026-04-20 01:53:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:14.424783 | orchestrator | 2026-04-20 01:53:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
01:53:14.424949 | orchestrator | 2026-04-20 01:53:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:53:17.474444 | orchestrator | 2026-04-20 01:53:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:17.476296 | orchestrator | 2026-04-20 01:53:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:53:17.476379 | orchestrator | 2026-04-20 01:53:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:53:20.518763 | orchestrator | 2026-04-20 01:53:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:20.520384 | orchestrator | 2026-04-20 01:53:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:53:20.520470 | orchestrator | 2026-04-20 01:53:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:53:23.563016 | orchestrator | 2026-04-20 01:53:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:23.565442 | orchestrator | 2026-04-20 01:53:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:53:23.565520 | orchestrator | 2026-04-20 01:53:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:53:26.615678 | orchestrator | 2026-04-20 01:53:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:26.616995 | orchestrator | 2026-04-20 01:53:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:53:26.617057 | orchestrator | 2026-04-20 01:53:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:53:29.662387 | orchestrator | 2026-04-20 01:53:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:29.664090 | orchestrator | 2026-04-20 01:53:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:53:29.664127 | orchestrator | 2026-04-20 01:53:29 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 01:53:32.715763 | orchestrator | 2026-04-20 01:53:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:32.717507 | orchestrator | 2026-04-20 01:53:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:53:32.717977 | orchestrator | 2026-04-20 01:53:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:53:35.764206 | orchestrator | 2026-04-20 01:53:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:35.765421 | orchestrator | 2026-04-20 01:53:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:53:35.765470 | orchestrator | 2026-04-20 01:53:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:53:38.806757 | orchestrator | 2026-04-20 01:53:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:38.807868 | orchestrator | 2026-04-20 01:53:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:53:38.807918 | orchestrator | 2026-04-20 01:53:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:53:41.854466 | orchestrator | 2026-04-20 01:53:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:41.856168 | orchestrator | 2026-04-20 01:53:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:53:41.856261 | orchestrator | 2026-04-20 01:53:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:53:44.898851 | orchestrator | 2026-04-20 01:53:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:44.901371 | orchestrator | 2026-04-20 01:53:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:53:44.901437 | orchestrator | 2026-04-20 01:53:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:53:47.951921 | orchestrator | 2026-04-20 
01:53:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:47.952991 | orchestrator | 2026-04-20 01:53:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:53:47.953092 | orchestrator | 2026-04-20 01:53:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:53:50.994412 | orchestrator | 2026-04-20 01:53:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:50.994817 | orchestrator | 2026-04-20 01:53:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:53:50.994850 | orchestrator | 2026-04-20 01:53:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:53:54.042444 | orchestrator | 2026-04-20 01:53:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:54.044626 | orchestrator | 2026-04-20 01:53:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:53:54.044685 | orchestrator | 2026-04-20 01:53:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:53:57.092535 | orchestrator | 2026-04-20 01:53:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:53:57.094425 | orchestrator | 2026-04-20 01:53:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:53:57.094471 | orchestrator | 2026-04-20 01:53:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:00.137720 | orchestrator | 2026-04-20 01:54:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:00.138932 | orchestrator | 2026-04-20 01:54:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:00.139031 | orchestrator | 2026-04-20 01:54:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:03.183194 | orchestrator | 2026-04-20 01:54:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 01:54:03.191900 | orchestrator | 2026-04-20 01:54:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:03.191950 | orchestrator | 2026-04-20 01:54:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:06.230729 | orchestrator | 2026-04-20 01:54:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:06.232616 | orchestrator | 2026-04-20 01:54:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:06.232771 | orchestrator | 2026-04-20 01:54:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:09.272458 | orchestrator | 2026-04-20 01:54:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:09.273134 | orchestrator | 2026-04-20 01:54:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:09.273165 | orchestrator | 2026-04-20 01:54:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:12.317640 | orchestrator | 2026-04-20 01:54:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:12.320180 | orchestrator | 2026-04-20 01:54:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:12.320263 | orchestrator | 2026-04-20 01:54:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:15.371985 | orchestrator | 2026-04-20 01:54:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:15.373283 | orchestrator | 2026-04-20 01:54:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:15.373349 | orchestrator | 2026-04-20 01:54:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:18.426826 | orchestrator | 2026-04-20 01:54:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:18.429420 | orchestrator | 2026-04-20 01:54:18 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:18.429643 | orchestrator | 2026-04-20 01:54:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:21.474864 | orchestrator | 2026-04-20 01:54:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:21.478493 | orchestrator | 2026-04-20 01:54:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:21.478585 | orchestrator | 2026-04-20 01:54:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:24.521595 | orchestrator | 2026-04-20 01:54:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:24.523078 | orchestrator | 2026-04-20 01:54:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:24.523140 | orchestrator | 2026-04-20 01:54:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:27.573017 | orchestrator | 2026-04-20 01:54:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:27.575064 | orchestrator | 2026-04-20 01:54:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:27.575116 | orchestrator | 2026-04-20 01:54:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:30.624861 | orchestrator | 2026-04-20 01:54:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:30.626215 | orchestrator | 2026-04-20 01:54:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:30.626248 | orchestrator | 2026-04-20 01:54:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:33.677792 | orchestrator | 2026-04-20 01:54:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:33.680194 | orchestrator | 2026-04-20 01:54:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
01:54:33.680283 | orchestrator | 2026-04-20 01:54:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:36.724319 | orchestrator | 2026-04-20 01:54:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:36.725669 | orchestrator | 2026-04-20 01:54:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:36.725725 | orchestrator | 2026-04-20 01:54:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:39.777558 | orchestrator | 2026-04-20 01:54:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:39.778653 | orchestrator | 2026-04-20 01:54:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:39.778704 | orchestrator | 2026-04-20 01:54:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:42.828057 | orchestrator | 2026-04-20 01:54:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:42.829402 | orchestrator | 2026-04-20 01:54:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:42.829452 | orchestrator | 2026-04-20 01:54:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:45.878813 | orchestrator | 2026-04-20 01:54:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:45.879806 | orchestrator | 2026-04-20 01:54:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:45.879998 | orchestrator | 2026-04-20 01:54:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:48.931197 | orchestrator | 2026-04-20 01:54:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:48.933546 | orchestrator | 2026-04-20 01:54:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:48.933681 | orchestrator | 2026-04-20 01:54:48 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 01:54:51.974243 | orchestrator | 2026-04-20 01:54:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:51.974574 | orchestrator | 2026-04-20 01:54:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:51.974653 | orchestrator | 2026-04-20 01:54:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:55.025636 | orchestrator | 2026-04-20 01:54:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:55.027689 | orchestrator | 2026-04-20 01:54:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:55.027737 | orchestrator | 2026-04-20 01:54:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:54:58.074348 | orchestrator | 2026-04-20 01:54:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:54:58.075721 | orchestrator | 2026-04-20 01:54:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:54:58.075742 | orchestrator | 2026-04-20 01:54:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:01.122647 | orchestrator | 2026-04-20 01:55:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:01.123975 | orchestrator | 2026-04-20 01:55:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:01.124177 | orchestrator | 2026-04-20 01:55:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:04.170905 | orchestrator | 2026-04-20 01:55:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:04.172300 | orchestrator | 2026-04-20 01:55:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:04.172316 | orchestrator | 2026-04-20 01:55:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:07.215580 | orchestrator | 2026-04-20 
01:55:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:07.217002 | orchestrator | 2026-04-20 01:55:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:07.217048 | orchestrator | 2026-04-20 01:55:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:10.264362 | orchestrator | 2026-04-20 01:55:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:10.265532 | orchestrator | 2026-04-20 01:55:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:10.265578 | orchestrator | 2026-04-20 01:55:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:13.312685 | orchestrator | 2026-04-20 01:55:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:13.314315 | orchestrator | 2026-04-20 01:55:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:13.314395 | orchestrator | 2026-04-20 01:55:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:16.359328 | orchestrator | 2026-04-20 01:55:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:16.360223 | orchestrator | 2026-04-20 01:55:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:16.360282 | orchestrator | 2026-04-20 01:55:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:19.402503 | orchestrator | 2026-04-20 01:55:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:19.403944 | orchestrator | 2026-04-20 01:55:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:19.404035 | orchestrator | 2026-04-20 01:55:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:22.446362 | orchestrator | 2026-04-20 01:55:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 01:55:22.447769 | orchestrator | 2026-04-20 01:55:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:22.447847 | orchestrator | 2026-04-20 01:55:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:25.499026 | orchestrator | 2026-04-20 01:55:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:25.500792 | orchestrator | 2026-04-20 01:55:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:25.500837 | orchestrator | 2026-04-20 01:55:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:28.552057 | orchestrator | 2026-04-20 01:55:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:28.553905 | orchestrator | 2026-04-20 01:55:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:28.553973 | orchestrator | 2026-04-20 01:55:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:31.609061 | orchestrator | 2026-04-20 01:55:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:31.610659 | orchestrator | 2026-04-20 01:55:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:31.610710 | orchestrator | 2026-04-20 01:55:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:34.660787 | orchestrator | 2026-04-20 01:55:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:34.662536 | orchestrator | 2026-04-20 01:55:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:34.662717 | orchestrator | 2026-04-20 01:55:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:37.709515 | orchestrator | 2026-04-20 01:55:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:37.711050 | orchestrator | 2026-04-20 01:55:37 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:37.711154 | orchestrator | 2026-04-20 01:55:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:40.760501 | orchestrator | 2026-04-20 01:55:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:40.764502 | orchestrator | 2026-04-20 01:55:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:40.764586 | orchestrator | 2026-04-20 01:55:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:43.813019 | orchestrator | 2026-04-20 01:55:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:43.814325 | orchestrator | 2026-04-20 01:55:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:43.814379 | orchestrator | 2026-04-20 01:55:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:46.863574 | orchestrator | 2026-04-20 01:55:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:46.866284 | orchestrator | 2026-04-20 01:55:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:46.866412 | orchestrator | 2026-04-20 01:55:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:49.911218 | orchestrator | 2026-04-20 01:55:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:49.913246 | orchestrator | 2026-04-20 01:55:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:49.913365 | orchestrator | 2026-04-20 01:55:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:52.961063 | orchestrator | 2026-04-20 01:55:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:52.963356 | orchestrator | 2026-04-20 01:55:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
01:55:52.963416 | orchestrator | 2026-04-20 01:55:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:56.012036 | orchestrator | 2026-04-20 01:55:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:56.013283 | orchestrator | 2026-04-20 01:55:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:56.013363 | orchestrator | 2026-04-20 01:55:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:55:59.055785 | orchestrator | 2026-04-20 01:55:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:55:59.056147 | orchestrator | 2026-04-20 01:55:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:55:59.056283 | orchestrator | 2026-04-20 01:55:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:56:02.102729 | orchestrator | 2026-04-20 01:56:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:56:02.103663 | orchestrator | 2026-04-20 01:56:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:56:02.103711 | orchestrator | 2026-04-20 01:56:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:56:05.153326 | orchestrator | 2026-04-20 01:56:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:56:05.154548 | orchestrator | 2026-04-20 01:56:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:56:05.154618 | orchestrator | 2026-04-20 01:56:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:56:08.197978 | orchestrator | 2026-04-20 01:56:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:56:08.199102 | orchestrator | 2026-04-20 01:56:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:56:08.199361 | orchestrator | 2026-04-20 01:56:08 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 01:56:11.247063 | orchestrator | 2026-04-20 01:56:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:56:11.248693 | orchestrator | 2026-04-20 01:56:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:56:11.248745 | orchestrator | 2026-04-20 01:56:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:56:14.295964 | orchestrator | 2026-04-20 01:56:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:56:14.298191 | orchestrator | 2026-04-20 01:56:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:56:14.298249 | orchestrator | 2026-04-20 01:56:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:56:17.342855 | orchestrator | 2026-04-20 01:56:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:56:17.343948 | orchestrator | 2026-04-20 01:56:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:56:17.344074 | orchestrator | 2026-04-20 01:56:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:56:20.392723 | orchestrator | 2026-04-20 01:56:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:56:20.393735 | orchestrator | 2026-04-20 01:56:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:56:20.393776 | orchestrator | 2026-04-20 01:56:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:56:23.441495 | orchestrator | 2026-04-20 01:56:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:56:23.443991 | orchestrator | 2026-04-20 01:56:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:56:23.444070 | orchestrator | 2026-04-20 01:56:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:56:26.482338 | orchestrator | 2026-04-20 
01:56:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:56:26.483829 | orchestrator | 2026-04-20 01:56:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:56:26.483952 | orchestrator | 2026-04-20 01:56:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:56:29.528786 | orchestrator | 2026-04-20 01:56:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:56:29.530194 | orchestrator | 2026-04-20 01:56:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:56:29.530254 | orchestrator | 2026-04-20 01:56:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:56:32.574416 | orchestrator | 2026-04-20 01:56:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:56:32.576284 | orchestrator | 2026-04-20 01:56:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:56:32.576363 | orchestrator | 2026-04-20 01:56:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:56:35.624472 | orchestrator | 2026-04-20 01:56:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:56:35.625010 | orchestrator | 2026-04-20 01:56:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:56:35.625050 | orchestrator | 2026-04-20 01:56:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:56:38.668708 | orchestrator | 2026-04-20 01:56:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 01:56:38.670308 | orchestrator | 2026-04-20 01:56:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 01:56:38.670377 | orchestrator | 2026-04-20 01:56:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 01:56:41.710918 | orchestrator | 2026-04-20 01:56:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED
2026-04-20 01:56:41.712797 | orchestrator | 2026-04-20 01:56:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 01:56:41.712861 | orchestrator | 2026-04-20 01:56:41 | INFO  | Wait 1 second(s) until the next check
[... repetitive polling output elided: tasks 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e and 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c remained in state STARTED, re-checked roughly every 3 seconds from 01:56:44 to 02:01:55 ...]
2026-04-20 02:01:58.781983 | orchestrator | 2026-04-20 02:01:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state
STARTED 2026-04-20 02:01:58.783405 | orchestrator | 2026-04-20 02:01:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:01:58.783428 | orchestrator | 2026-04-20 02:01:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:01.834213 | orchestrator | 2026-04-20 02:02:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:01.836471 | orchestrator | 2026-04-20 02:02:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:01.836521 | orchestrator | 2026-04-20 02:02:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:04.881468 | orchestrator | 2026-04-20 02:02:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:04.882435 | orchestrator | 2026-04-20 02:02:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:04.882477 | orchestrator | 2026-04-20 02:02:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:07.929030 | orchestrator | 2026-04-20 02:02:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:07.931456 | orchestrator | 2026-04-20 02:02:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:07.931552 | orchestrator | 2026-04-20 02:02:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:10.979634 | orchestrator | 2026-04-20 02:02:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:10.981348 | orchestrator | 2026-04-20 02:02:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:10.981400 | orchestrator | 2026-04-20 02:02:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:14.026562 | orchestrator | 2026-04-20 02:02:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:14.029018 | orchestrator | 2026-04-20 02:02:14 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:14.029114 | orchestrator | 2026-04-20 02:02:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:17.080909 | orchestrator | 2026-04-20 02:02:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:17.083476 | orchestrator | 2026-04-20 02:02:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:17.083601 | orchestrator | 2026-04-20 02:02:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:20.121554 | orchestrator | 2026-04-20 02:02:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:20.122846 | orchestrator | 2026-04-20 02:02:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:20.122880 | orchestrator | 2026-04-20 02:02:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:23.158525 | orchestrator | 2026-04-20 02:02:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:23.159428 | orchestrator | 2026-04-20 02:02:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:23.159466 | orchestrator | 2026-04-20 02:02:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:26.193479 | orchestrator | 2026-04-20 02:02:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:26.194966 | orchestrator | 2026-04-20 02:02:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:26.195009 | orchestrator | 2026-04-20 02:02:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:29.238216 | orchestrator | 2026-04-20 02:02:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:29.240213 | orchestrator | 2026-04-20 02:02:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
02:02:29.240292 | orchestrator | 2026-04-20 02:02:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:32.292622 | orchestrator | 2026-04-20 02:02:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:32.294349 | orchestrator | 2026-04-20 02:02:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:32.294434 | orchestrator | 2026-04-20 02:02:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:35.337854 | orchestrator | 2026-04-20 02:02:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:35.339366 | orchestrator | 2026-04-20 02:02:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:35.339410 | orchestrator | 2026-04-20 02:02:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:38.384009 | orchestrator | 2026-04-20 02:02:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:38.386307 | orchestrator | 2026-04-20 02:02:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:38.386591 | orchestrator | 2026-04-20 02:02:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:41.428361 | orchestrator | 2026-04-20 02:02:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:41.431299 | orchestrator | 2026-04-20 02:02:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:41.431397 | orchestrator | 2026-04-20 02:02:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:44.484066 | orchestrator | 2026-04-20 02:02:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:44.487788 | orchestrator | 2026-04-20 02:02:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:44.487854 | orchestrator | 2026-04-20 02:02:44 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 02:02:47.543256 | orchestrator | 2026-04-20 02:02:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:47.545534 | orchestrator | 2026-04-20 02:02:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:47.545610 | orchestrator | 2026-04-20 02:02:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:50.598585 | orchestrator | 2026-04-20 02:02:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:50.600213 | orchestrator | 2026-04-20 02:02:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:50.600248 | orchestrator | 2026-04-20 02:02:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:53.654002 | orchestrator | 2026-04-20 02:02:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:53.656072 | orchestrator | 2026-04-20 02:02:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:53.656204 | orchestrator | 2026-04-20 02:02:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:56.707064 | orchestrator | 2026-04-20 02:02:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:56.708413 | orchestrator | 2026-04-20 02:02:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:56.708465 | orchestrator | 2026-04-20 02:02:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:02:59.753647 | orchestrator | 2026-04-20 02:02:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:02:59.755775 | orchestrator | 2026-04-20 02:02:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:02:59.755959 | orchestrator | 2026-04-20 02:02:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:02.808922 | orchestrator | 2026-04-20 
02:03:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:03:02.811508 | orchestrator | 2026-04-20 02:03:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:03:02.811684 | orchestrator | 2026-04-20 02:03:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:05.857452 | orchestrator | 2026-04-20 02:03:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:03:05.858848 | orchestrator | 2026-04-20 02:03:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:03:05.858908 | orchestrator | 2026-04-20 02:03:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:08.908567 | orchestrator | 2026-04-20 02:03:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:03:08.910321 | orchestrator | 2026-04-20 02:03:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:03:08.910379 | orchestrator | 2026-04-20 02:03:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:11.957083 | orchestrator | 2026-04-20 02:03:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:03:11.959388 | orchestrator | 2026-04-20 02:03:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:03:11.959430 | orchestrator | 2026-04-20 02:03:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:15.001361 | orchestrator | 2026-04-20 02:03:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:03:15.002581 | orchestrator | 2026-04-20 02:03:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:03:15.002660 | orchestrator | 2026-04-20 02:03:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:18.057275 | orchestrator | 2026-04-20 02:03:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 02:03:18.058264 | orchestrator | 2026-04-20 02:03:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:03:18.058369 | orchestrator | 2026-04-20 02:03:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:21.108688 | orchestrator | 2026-04-20 02:03:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:03:21.110383 | orchestrator | 2026-04-20 02:03:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:03:21.110436 | orchestrator | 2026-04-20 02:03:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:24.162699 | orchestrator | 2026-04-20 02:03:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:03:24.164113 | orchestrator | 2026-04-20 02:03:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:03:24.164283 | orchestrator | 2026-04-20 02:03:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:27.209258 | orchestrator | 2026-04-20 02:03:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:03:27.209988 | orchestrator | 2026-04-20 02:03:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:03:27.210235 | orchestrator | 2026-04-20 02:03:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:30.255970 | orchestrator | 2026-04-20 02:03:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:03:30.256289 | orchestrator | 2026-04-20 02:03:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:03:30.256312 | orchestrator | 2026-04-20 02:03:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:33.306124 | orchestrator | 2026-04-20 02:03:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:03:33.307735 | orchestrator | 2026-04-20 02:03:33 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:03:33.307810 | orchestrator | 2026-04-20 02:03:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:36.348440 | orchestrator | 2026-04-20 02:03:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:03:36.349729 | orchestrator | 2026-04-20 02:03:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:03:36.349895 | orchestrator | 2026-04-20 02:03:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:39.394637 | orchestrator | 2026-04-20 02:03:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:03:39.395215 | orchestrator | 2026-04-20 02:03:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:03:39.395253 | orchestrator | 2026-04-20 02:03:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:42.440554 | orchestrator | 2026-04-20 02:03:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:03:42.441847 | orchestrator | 2026-04-20 02:03:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:03:42.442235 | orchestrator | 2026-04-20 02:03:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:45.495431 | orchestrator | 2026-04-20 02:03:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:03:45.497214 | orchestrator | 2026-04-20 02:03:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:03:45.497386 | orchestrator | 2026-04-20 02:03:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:48.553351 | orchestrator | 2026-04-20 02:03:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:03:48.554275 | orchestrator | 2026-04-20 02:03:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
02:03:48.554445 | orchestrator | 2026-04-20 02:03:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:51.602512 | orchestrator | 2026-04-20 02:03:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:03:51.604940 | orchestrator | 2026-04-20 02:03:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:03:51.605123 | orchestrator | 2026-04-20 02:03:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:54.653272 | orchestrator | 2026-04-20 02:03:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:03:54.654327 | orchestrator | 2026-04-20 02:03:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:03:54.654414 | orchestrator | 2026-04-20 02:03:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:03:57.706546 | orchestrator | 2026-04-20 02:03:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:03:57.708405 | orchestrator | 2026-04-20 02:03:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:03:57.708473 | orchestrator | 2026-04-20 02:03:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:00.752864 | orchestrator | 2026-04-20 02:04:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:00.754375 | orchestrator | 2026-04-20 02:04:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:00.754416 | orchestrator | 2026-04-20 02:04:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:03.799158 | orchestrator | 2026-04-20 02:04:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:03.801138 | orchestrator | 2026-04-20 02:04:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:03.801285 | orchestrator | 2026-04-20 02:04:03 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 02:04:06.851337 | orchestrator | 2026-04-20 02:04:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:06.852350 | orchestrator | 2026-04-20 02:04:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:06.852386 | orchestrator | 2026-04-20 02:04:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:09.901737 | orchestrator | 2026-04-20 02:04:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:09.904067 | orchestrator | 2026-04-20 02:04:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:09.904151 | orchestrator | 2026-04-20 02:04:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:12.945679 | orchestrator | 2026-04-20 02:04:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:12.946254 | orchestrator | 2026-04-20 02:04:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:12.946431 | orchestrator | 2026-04-20 02:04:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:15.991588 | orchestrator | 2026-04-20 02:04:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:15.993136 | orchestrator | 2026-04-20 02:04:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:15.993284 | orchestrator | 2026-04-20 02:04:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:19.042953 | orchestrator | 2026-04-20 02:04:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:19.044987 | orchestrator | 2026-04-20 02:04:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:19.045060 | orchestrator | 2026-04-20 02:04:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:22.088414 | orchestrator | 2026-04-20 
02:04:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:22.089645 | orchestrator | 2026-04-20 02:04:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:22.089727 | orchestrator | 2026-04-20 02:04:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:25.132672 | orchestrator | 2026-04-20 02:04:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:25.134522 | orchestrator | 2026-04-20 02:04:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:25.134626 | orchestrator | 2026-04-20 02:04:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:28.177631 | orchestrator | 2026-04-20 02:04:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:28.179632 | orchestrator | 2026-04-20 02:04:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:28.179704 | orchestrator | 2026-04-20 02:04:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:31.224163 | orchestrator | 2026-04-20 02:04:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:31.225801 | orchestrator | 2026-04-20 02:04:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:31.225881 | orchestrator | 2026-04-20 02:04:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:34.276715 | orchestrator | 2026-04-20 02:04:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:34.278376 | orchestrator | 2026-04-20 02:04:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:34.278430 | orchestrator | 2026-04-20 02:04:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:37.337471 | orchestrator | 2026-04-20 02:04:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 02:04:37.338609 | orchestrator | 2026-04-20 02:04:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:37.338659 | orchestrator | 2026-04-20 02:04:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:40.390486 | orchestrator | 2026-04-20 02:04:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:40.391445 | orchestrator | 2026-04-20 02:04:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:40.391501 | orchestrator | 2026-04-20 02:04:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:43.444587 | orchestrator | 2026-04-20 02:04:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:43.447607 | orchestrator | 2026-04-20 02:04:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:43.447662 | orchestrator | 2026-04-20 02:04:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:46.496081 | orchestrator | 2026-04-20 02:04:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:46.498081 | orchestrator | 2026-04-20 02:04:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:46.498114 | orchestrator | 2026-04-20 02:04:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:49.540589 | orchestrator | 2026-04-20 02:04:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:49.542156 | orchestrator | 2026-04-20 02:04:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:49.542286 | orchestrator | 2026-04-20 02:04:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:52.589133 | orchestrator | 2026-04-20 02:04:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:52.591162 | orchestrator | 2026-04-20 02:04:52 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:52.591220 | orchestrator | 2026-04-20 02:04:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:55.632892 | orchestrator | 2026-04-20 02:04:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:55.634313 | orchestrator | 2026-04-20 02:04:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:55.634352 | orchestrator | 2026-04-20 02:04:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:04:58.675066 | orchestrator | 2026-04-20 02:04:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:04:58.676791 | orchestrator | 2026-04-20 02:04:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:04:58.676858 | orchestrator | 2026-04-20 02:04:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:05:01.722646 | orchestrator | 2026-04-20 02:05:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:05:01.724090 | orchestrator | 2026-04-20 02:05:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:05:01.724126 | orchestrator | 2026-04-20 02:05:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:05:04.772176 | orchestrator | 2026-04-20 02:05:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:05:04.773286 | orchestrator | 2026-04-20 02:05:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:05:04.773306 | orchestrator | 2026-04-20 02:05:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:05:07.821757 | orchestrator | 2026-04-20 02:05:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:05:07.824512 | orchestrator | 2026-04-20 02:05:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
02:05:07.824583 | orchestrator | 2026-04-20 02:05:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:05:10.868302 | orchestrator | 2026-04-20 02:05:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:05:10.870143 | orchestrator | 2026-04-20 02:05:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:05:10.870187 | orchestrator | 2026-04-20 02:05:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:05:13.917319 | orchestrator | 2026-04-20 02:05:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:05:13.920811 | orchestrator | 2026-04-20 02:05:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:05:13.920946 | orchestrator | 2026-04-20 02:05:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:05:16.965089 | orchestrator | 2026-04-20 02:05:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:05:16.966865 | orchestrator | 2026-04-20 02:05:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:05:16.966932 | orchestrator | 2026-04-20 02:05:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:05:20.017217 | orchestrator | 2026-04-20 02:05:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:05:20.019224 | orchestrator | 2026-04-20 02:05:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:05:20.019319 | orchestrator | 2026-04-20 02:05:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:05:23.064827 | orchestrator | 2026-04-20 02:05:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:05:23.066715 | orchestrator | 2026-04-20 02:05:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:05:23.066970 | orchestrator | 2026-04-20 02:05:23 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 02:05:26.114453 | orchestrator | 2026-04-20 02:05:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:05:26.117226 | orchestrator | 2026-04-20 02:05:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:05:26.117379 | orchestrator | 2026-04-20 02:05:26 | INFO  | Wait 1 second(s) until the next check
[... identical polling output repeated every ~3 seconds from 02:05:29 to 02:10:37; both tasks remained in state STARTED ...]
2026-04-20 02:10:40.340872 | orchestrator | 2026-04-20 02:10:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:10:40.342485 | orchestrator | 2026-04-20 02:10:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:10:40.342593 | orchestrator | 2026-04-20 02:10:40 | INFO  | Wait 1 second(s)
until the next check 2026-04-20 02:10:43.389760 | orchestrator | 2026-04-20 02:10:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:10:43.391016 | orchestrator | 2026-04-20 02:10:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:10:43.391041 | orchestrator | 2026-04-20 02:10:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:10:46.438905 | orchestrator | 2026-04-20 02:10:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:10:46.441576 | orchestrator | 2026-04-20 02:10:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:10:46.441734 | orchestrator | 2026-04-20 02:10:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:10:49.487465 | orchestrator | 2026-04-20 02:10:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:10:49.488358 | orchestrator | 2026-04-20 02:10:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:10:49.488390 | orchestrator | 2026-04-20 02:10:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:10:52.531146 | orchestrator | 2026-04-20 02:10:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:10:52.533161 | orchestrator | 2026-04-20 02:10:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:10:52.533198 | orchestrator | 2026-04-20 02:10:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:10:55.580952 | orchestrator | 2026-04-20 02:10:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:10:55.582316 | orchestrator | 2026-04-20 02:10:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:10:55.582472 | orchestrator | 2026-04-20 02:10:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:10:58.634662 | orchestrator | 2026-04-20 
02:10:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:10:58.637120 | orchestrator | 2026-04-20 02:10:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:10:58.637207 | orchestrator | 2026-04-20 02:10:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:01.682896 | orchestrator | 2026-04-20 02:11:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:01.685877 | orchestrator | 2026-04-20 02:11:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:01.685945 | orchestrator | 2026-04-20 02:11:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:04.737614 | orchestrator | 2026-04-20 02:11:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:04.739316 | orchestrator | 2026-04-20 02:11:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:04.739406 | orchestrator | 2026-04-20 02:11:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:07.784893 | orchestrator | 2026-04-20 02:11:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:07.786535 | orchestrator | 2026-04-20 02:11:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:07.786574 | orchestrator | 2026-04-20 02:11:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:10.835081 | orchestrator | 2026-04-20 02:11:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:10.836881 | orchestrator | 2026-04-20 02:11:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:10.836940 | orchestrator | 2026-04-20 02:11:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:13.887982 | orchestrator | 2026-04-20 02:11:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 02:11:13.889734 | orchestrator | 2026-04-20 02:11:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:13.889799 | orchestrator | 2026-04-20 02:11:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:16.940790 | orchestrator | 2026-04-20 02:11:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:16.941844 | orchestrator | 2026-04-20 02:11:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:16.941945 | orchestrator | 2026-04-20 02:11:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:19.990572 | orchestrator | 2026-04-20 02:11:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:19.992036 | orchestrator | 2026-04-20 02:11:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:19.992091 | orchestrator | 2026-04-20 02:11:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:23.035744 | orchestrator | 2026-04-20 02:11:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:23.038559 | orchestrator | 2026-04-20 02:11:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:23.038628 | orchestrator | 2026-04-20 02:11:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:26.083493 | orchestrator | 2026-04-20 02:11:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:26.084453 | orchestrator | 2026-04-20 02:11:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:26.084502 | orchestrator | 2026-04-20 02:11:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:29.135260 | orchestrator | 2026-04-20 02:11:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:29.136062 | orchestrator | 2026-04-20 02:11:29 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:29.136192 | orchestrator | 2026-04-20 02:11:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:32.177876 | orchestrator | 2026-04-20 02:11:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:32.179792 | orchestrator | 2026-04-20 02:11:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:32.179832 | orchestrator | 2026-04-20 02:11:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:35.230448 | orchestrator | 2026-04-20 02:11:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:35.232825 | orchestrator | 2026-04-20 02:11:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:35.232886 | orchestrator | 2026-04-20 02:11:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:38.288140 | orchestrator | 2026-04-20 02:11:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:38.289862 | orchestrator | 2026-04-20 02:11:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:38.289950 | orchestrator | 2026-04-20 02:11:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:41.336757 | orchestrator | 2026-04-20 02:11:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:41.338887 | orchestrator | 2026-04-20 02:11:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:41.338950 | orchestrator | 2026-04-20 02:11:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:44.381796 | orchestrator | 2026-04-20 02:11:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:44.383541 | orchestrator | 2026-04-20 02:11:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
02:11:44.383581 | orchestrator | 2026-04-20 02:11:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:47.430722 | orchestrator | 2026-04-20 02:11:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:47.432888 | orchestrator | 2026-04-20 02:11:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:47.432950 | orchestrator | 2026-04-20 02:11:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:50.482932 | orchestrator | 2026-04-20 02:11:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:50.483696 | orchestrator | 2026-04-20 02:11:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:50.483733 | orchestrator | 2026-04-20 02:11:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:53.533206 | orchestrator | 2026-04-20 02:11:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:53.537089 | orchestrator | 2026-04-20 02:11:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:53.537185 | orchestrator | 2026-04-20 02:11:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:56.579793 | orchestrator | 2026-04-20 02:11:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:56.581493 | orchestrator | 2026-04-20 02:11:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:56.582708 | orchestrator | 2026-04-20 02:11:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:11:59.630193 | orchestrator | 2026-04-20 02:11:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:11:59.632036 | orchestrator | 2026-04-20 02:11:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:11:59.632152 | orchestrator | 2026-04-20 02:11:59 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 02:12:02.680651 | orchestrator | 2026-04-20 02:12:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:12:02.683229 | orchestrator | 2026-04-20 02:12:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:02.683512 | orchestrator | 2026-04-20 02:12:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:12:05.732883 | orchestrator | 2026-04-20 02:12:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:12:05.734471 | orchestrator | 2026-04-20 02:12:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:05.734523 | orchestrator | 2026-04-20 02:12:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:12:08.788941 | orchestrator | 2026-04-20 02:12:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:12:08.790624 | orchestrator | 2026-04-20 02:12:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:08.790693 | orchestrator | 2026-04-20 02:12:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:12:11.838144 | orchestrator | 2026-04-20 02:12:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:12:11.840416 | orchestrator | 2026-04-20 02:12:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:11.840496 | orchestrator | 2026-04-20 02:12:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:12:14.885024 | orchestrator | 2026-04-20 02:12:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:12:14.887479 | orchestrator | 2026-04-20 02:12:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:14.887567 | orchestrator | 2026-04-20 02:12:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:12:17.934884 | orchestrator | 2026-04-20 
02:12:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:12:17.936652 | orchestrator | 2026-04-20 02:12:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:17.936724 | orchestrator | 2026-04-20 02:12:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:12:20.984731 | orchestrator | 2026-04-20 02:12:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:12:20.986979 | orchestrator | 2026-04-20 02:12:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:20.988322 | orchestrator | 2026-04-20 02:12:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:12:24.031068 | orchestrator | 2026-04-20 02:12:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:12:24.033579 | orchestrator | 2026-04-20 02:12:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:24.033691 | orchestrator | 2026-04-20 02:12:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:12:27.079546 | orchestrator | 2026-04-20 02:12:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:12:27.081166 | orchestrator | 2026-04-20 02:12:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:27.081217 | orchestrator | 2026-04-20 02:12:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:12:30.126675 | orchestrator | 2026-04-20 02:12:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:12:30.127576 | orchestrator | 2026-04-20 02:12:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:30.127621 | orchestrator | 2026-04-20 02:12:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:12:33.174720 | orchestrator | 2026-04-20 02:12:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 02:12:33.179100 | orchestrator | 2026-04-20 02:12:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:33.179230 | orchestrator | 2026-04-20 02:12:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:12:36.226303 | orchestrator | 2026-04-20 02:12:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:12:36.227475 | orchestrator | 2026-04-20 02:12:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:36.227524 | orchestrator | 2026-04-20 02:12:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:12:39.272562 | orchestrator | 2026-04-20 02:12:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:12:39.273738 | orchestrator | 2026-04-20 02:12:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:39.273808 | orchestrator | 2026-04-20 02:12:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:12:42.321335 | orchestrator | 2026-04-20 02:12:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:12:42.324006 | orchestrator | 2026-04-20 02:12:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:42.324068 | orchestrator | 2026-04-20 02:12:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:12:45.373338 | orchestrator | 2026-04-20 02:12:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:12:45.374601 | orchestrator | 2026-04-20 02:12:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:45.374665 | orchestrator | 2026-04-20 02:12:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:12:48.420855 | orchestrator | 2026-04-20 02:12:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:12:48.423306 | orchestrator | 2026-04-20 02:12:48 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:48.423339 | orchestrator | 2026-04-20 02:12:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:12:51.462189 | orchestrator | 2026-04-20 02:12:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:12:51.463675 | orchestrator | 2026-04-20 02:12:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:51.463727 | orchestrator | 2026-04-20 02:12:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:12:54.517660 | orchestrator | 2026-04-20 02:12:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:12:54.520091 | orchestrator | 2026-04-20 02:12:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:54.520144 | orchestrator | 2026-04-20 02:12:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:12:57.568517 | orchestrator | 2026-04-20 02:12:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:12:57.570128 | orchestrator | 2026-04-20 02:12:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:12:57.570184 | orchestrator | 2026-04-20 02:12:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:00.618506 | orchestrator | 2026-04-20 02:13:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:00.620691 | orchestrator | 2026-04-20 02:13:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:00.620861 | orchestrator | 2026-04-20 02:13:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:03.668522 | orchestrator | 2026-04-20 02:13:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:03.670766 | orchestrator | 2026-04-20 02:13:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
02:13:03.670890 | orchestrator | 2026-04-20 02:13:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:06.719277 | orchestrator | 2026-04-20 02:13:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:06.721424 | orchestrator | 2026-04-20 02:13:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:06.721551 | orchestrator | 2026-04-20 02:13:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:09.772917 | orchestrator | 2026-04-20 02:13:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:09.774967 | orchestrator | 2026-04-20 02:13:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:09.775140 | orchestrator | 2026-04-20 02:13:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:12.823853 | orchestrator | 2026-04-20 02:13:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:12.826116 | orchestrator | 2026-04-20 02:13:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:12.826675 | orchestrator | 2026-04-20 02:13:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:15.878567 | orchestrator | 2026-04-20 02:13:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:15.881252 | orchestrator | 2026-04-20 02:13:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:15.881337 | orchestrator | 2026-04-20 02:13:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:18.928895 | orchestrator | 2026-04-20 02:13:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:18.930379 | orchestrator | 2026-04-20 02:13:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:18.930460 | orchestrator | 2026-04-20 02:13:18 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 02:13:21.975055 | orchestrator | 2026-04-20 02:13:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:21.978270 | orchestrator | 2026-04-20 02:13:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:21.978367 | orchestrator | 2026-04-20 02:13:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:25.026472 | orchestrator | 2026-04-20 02:13:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:25.029227 | orchestrator | 2026-04-20 02:13:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:25.029285 | orchestrator | 2026-04-20 02:13:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:28.069152 | orchestrator | 2026-04-20 02:13:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:28.069534 | orchestrator | 2026-04-20 02:13:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:28.069560 | orchestrator | 2026-04-20 02:13:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:31.114784 | orchestrator | 2026-04-20 02:13:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:31.117193 | orchestrator | 2026-04-20 02:13:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:31.117274 | orchestrator | 2026-04-20 02:13:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:34.163187 | orchestrator | 2026-04-20 02:13:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:34.164357 | orchestrator | 2026-04-20 02:13:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:34.164462 | orchestrator | 2026-04-20 02:13:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:37.217002 | orchestrator | 2026-04-20 
02:13:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:37.219070 | orchestrator | 2026-04-20 02:13:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:37.219139 | orchestrator | 2026-04-20 02:13:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:40.269058 | orchestrator | 2026-04-20 02:13:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:40.270525 | orchestrator | 2026-04-20 02:13:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:40.270581 | orchestrator | 2026-04-20 02:13:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:43.321503 | orchestrator | 2026-04-20 02:13:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:43.323498 | orchestrator | 2026-04-20 02:13:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:43.323568 | orchestrator | 2026-04-20 02:13:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:46.372750 | orchestrator | 2026-04-20 02:13:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:46.374509 | orchestrator | 2026-04-20 02:13:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:46.374743 | orchestrator | 2026-04-20 02:13:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:49.426576 | orchestrator | 2026-04-20 02:13:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:49.427083 | orchestrator | 2026-04-20 02:13:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:49.427123 | orchestrator | 2026-04-20 02:13:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:52.470079 | orchestrator | 2026-04-20 02:13:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 02:13:52.471884 | orchestrator | 2026-04-20 02:13:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:52.471906 | orchestrator | 2026-04-20 02:13:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:55.521657 | orchestrator | 2026-04-20 02:13:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:55.523818 | orchestrator | 2026-04-20 02:13:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:55.523871 | orchestrator | 2026-04-20 02:13:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:13:58.572317 | orchestrator | 2026-04-20 02:13:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:13:58.574301 | orchestrator | 2026-04-20 02:13:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:13:58.574346 | orchestrator | 2026-04-20 02:13:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:14:01.615803 | orchestrator | 2026-04-20 02:14:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:14:01.617493 | orchestrator | 2026-04-20 02:14:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:14:01.617574 | orchestrator | 2026-04-20 02:14:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:14:04.666003 | orchestrator | 2026-04-20 02:14:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:14:04.668255 | orchestrator | 2026-04-20 02:14:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:14:04.668331 | orchestrator | 2026-04-20 02:14:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:14:07.717724 | orchestrator | 2026-04-20 02:14:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:14:07.719607 | orchestrator | 2026-04-20 02:14:07 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:14:07.719663 | orchestrator | 2026-04-20 02:14:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:14:10.771376 | orchestrator | 2026-04-20 02:14:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:14:10.773288 | orchestrator | 2026-04-20 02:14:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:14:10.773513 | orchestrator | 2026-04-20 02:14:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:14:13.824747 | orchestrator | 2026-04-20 02:14:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:14:13.827849 | orchestrator | 2026-04-20 02:14:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:14:13.828069 | orchestrator | 2026-04-20 02:14:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:14:16.878367 | orchestrator | 2026-04-20 02:14:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:14:16.880214 | orchestrator | 2026-04-20 02:14:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:14:16.880352 | orchestrator | 2026-04-20 02:14:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:14:19.931017 | orchestrator | 2026-04-20 02:14:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:14:19.933247 | orchestrator | 2026-04-20 02:14:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:14:19.933334 | orchestrator | 2026-04-20 02:14:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:14:22.981022 | orchestrator | 2026-04-20 02:14:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:14:22.982825 | orchestrator | 2026-04-20 02:14:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
02:14:22.982911 | orchestrator | 2026-04-20 02:14:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:14:26.035335 | orchestrator | 2026-04-20 02:14:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:14:26.037173 | orchestrator | 2026-04-20 02:14:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:14:26.037226 | orchestrator | 2026-04-20 02:14:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:14:29.079379 | orchestrator | 2026-04-20 02:14:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:14:29.080819 | orchestrator | 2026-04-20 02:14:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:14:29.080875 | orchestrator | 2026-04-20 02:14:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:14:32.130514 | orchestrator | 2026-04-20 02:14:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:14:32.131697 | orchestrator | 2026-04-20 02:14:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:14:32.131796 | orchestrator | 2026-04-20 02:14:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:14:35.176867 | orchestrator | 2026-04-20 02:14:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:14:35.177805 | orchestrator | 2026-04-20 02:14:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:14:35.177843 | orchestrator | 2026-04-20 02:14:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:14:38.221639 | orchestrator | 2026-04-20 02:14:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:14:38.223270 | orchestrator | 2026-04-20 02:14:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:14:38.223325 | orchestrator | 2026-04-20 02:14:38 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 02:19:58.406440 | orchestrator | 2026-04-20 02:19:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:19:58.407802 | orchestrator | 2026-04-20 02:19:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:19:58.407841 | orchestrator | 2026-04-20 02:19:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:01.460382 | orchestrator | 2026-04-20 02:20:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:01.461698 | orchestrator | 2026-04-20 02:20:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:01.461771 | orchestrator | 2026-04-20 02:20:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:04.511655 | orchestrator | 2026-04-20 02:20:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:04.512559 | orchestrator | 2026-04-20 02:20:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:04.512635 | orchestrator | 2026-04-20 02:20:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:07.559368 | orchestrator | 2026-04-20 02:20:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:07.561052 | orchestrator | 2026-04-20 02:20:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:07.561175 | orchestrator | 2026-04-20 02:20:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:10.609050 | orchestrator | 2026-04-20 02:20:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:10.610750 | orchestrator | 2026-04-20 02:20:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:10.610788 | orchestrator | 2026-04-20 02:20:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:13.662063 | orchestrator | 2026-04-20 
02:20:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:13.662680 | orchestrator | 2026-04-20 02:20:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:13.662706 | orchestrator | 2026-04-20 02:20:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:16.715219 | orchestrator | 2026-04-20 02:20:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:16.717130 | orchestrator | 2026-04-20 02:20:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:16.717190 | orchestrator | 2026-04-20 02:20:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:19.768215 | orchestrator | 2026-04-20 02:20:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:19.770700 | orchestrator | 2026-04-20 02:20:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:19.770802 | orchestrator | 2026-04-20 02:20:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:22.820546 | orchestrator | 2026-04-20 02:20:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:22.823419 | orchestrator | 2026-04-20 02:20:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:22.823529 | orchestrator | 2026-04-20 02:20:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:25.870252 | orchestrator | 2026-04-20 02:20:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:25.871980 | orchestrator | 2026-04-20 02:20:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:25.872039 | orchestrator | 2026-04-20 02:20:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:28.918263 | orchestrator | 2026-04-20 02:20:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 02:20:28.919776 | orchestrator | 2026-04-20 02:20:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:28.919820 | orchestrator | 2026-04-20 02:20:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:31.968063 | orchestrator | 2026-04-20 02:20:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:31.969501 | orchestrator | 2026-04-20 02:20:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:31.969539 | orchestrator | 2026-04-20 02:20:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:35.015421 | orchestrator | 2026-04-20 02:20:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:35.017245 | orchestrator | 2026-04-20 02:20:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:35.017398 | orchestrator | 2026-04-20 02:20:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:38.063467 | orchestrator | 2026-04-20 02:20:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:38.065240 | orchestrator | 2026-04-20 02:20:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:38.065303 | orchestrator | 2026-04-20 02:20:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:41.109544 | orchestrator | 2026-04-20 02:20:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:41.111519 | orchestrator | 2026-04-20 02:20:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:41.111572 | orchestrator | 2026-04-20 02:20:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:44.155859 | orchestrator | 2026-04-20 02:20:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:44.156685 | orchestrator | 2026-04-20 02:20:44 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:44.156735 | orchestrator | 2026-04-20 02:20:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:47.216975 | orchestrator | 2026-04-20 02:20:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:47.218924 | orchestrator | 2026-04-20 02:20:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:47.218971 | orchestrator | 2026-04-20 02:20:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:50.273460 | orchestrator | 2026-04-20 02:20:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:50.274906 | orchestrator | 2026-04-20 02:20:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:50.274985 | orchestrator | 2026-04-20 02:20:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:53.321619 | orchestrator | 2026-04-20 02:20:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:53.322994 | orchestrator | 2026-04-20 02:20:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:53.323059 | orchestrator | 2026-04-20 02:20:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:56.369461 | orchestrator | 2026-04-20 02:20:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:56.371838 | orchestrator | 2026-04-20 02:20:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:20:56.371905 | orchestrator | 2026-04-20 02:20:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:20:59.412797 | orchestrator | 2026-04-20 02:20:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:20:59.415089 | orchestrator | 2026-04-20 02:20:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
02:20:59.415142 | orchestrator | 2026-04-20 02:20:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:21:02.466962 | orchestrator | 2026-04-20 02:21:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:21:02.468961 | orchestrator | 2026-04-20 02:21:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:02.469017 | orchestrator | 2026-04-20 02:21:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:21:05.525415 | orchestrator | 2026-04-20 02:21:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:21:05.527049 | orchestrator | 2026-04-20 02:21:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:05.527108 | orchestrator | 2026-04-20 02:21:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:21:08.577148 | orchestrator | 2026-04-20 02:21:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:21:08.579220 | orchestrator | 2026-04-20 02:21:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:08.579399 | orchestrator | 2026-04-20 02:21:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:21:11.627373 | orchestrator | 2026-04-20 02:21:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:21:11.629089 | orchestrator | 2026-04-20 02:21:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:11.629193 | orchestrator | 2026-04-20 02:21:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:21:14.676921 | orchestrator | 2026-04-20 02:21:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:21:14.678465 | orchestrator | 2026-04-20 02:21:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:14.678846 | orchestrator | 2026-04-20 02:21:14 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 02:21:17.730221 | orchestrator | 2026-04-20 02:21:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:21:17.731949 | orchestrator | 2026-04-20 02:21:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:17.731971 | orchestrator | 2026-04-20 02:21:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:21:20.779872 | orchestrator | 2026-04-20 02:21:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:21:20.781917 | orchestrator | 2026-04-20 02:21:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:20.782379 | orchestrator | 2026-04-20 02:21:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:21:23.824495 | orchestrator | 2026-04-20 02:21:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:21:23.825928 | orchestrator | 2026-04-20 02:21:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:23.825954 | orchestrator | 2026-04-20 02:21:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:21:26.871763 | orchestrator | 2026-04-20 02:21:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:21:26.872440 | orchestrator | 2026-04-20 02:21:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:26.872485 | orchestrator | 2026-04-20 02:21:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:21:29.923925 | orchestrator | 2026-04-20 02:21:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:21:29.926744 | orchestrator | 2026-04-20 02:21:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:29.926823 | orchestrator | 2026-04-20 02:21:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:21:32.972435 | orchestrator | 2026-04-20 
02:21:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:21:32.974098 | orchestrator | 2026-04-20 02:21:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:32.974122 | orchestrator | 2026-04-20 02:21:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:21:36.021951 | orchestrator | 2026-04-20 02:21:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:21:36.023863 | orchestrator | 2026-04-20 02:21:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:36.023952 | orchestrator | 2026-04-20 02:21:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:21:39.059718 | orchestrator | 2026-04-20 02:21:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:21:39.061786 | orchestrator | 2026-04-20 02:21:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:39.061878 | orchestrator | 2026-04-20 02:21:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:21:42.103811 | orchestrator | 2026-04-20 02:21:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:21:42.105306 | orchestrator | 2026-04-20 02:21:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:42.105505 | orchestrator | 2026-04-20 02:21:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:21:45.156570 | orchestrator | 2026-04-20 02:21:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:21:45.157887 | orchestrator | 2026-04-20 02:21:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:45.157912 | orchestrator | 2026-04-20 02:21:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:21:48.199275 | orchestrator | 2026-04-20 02:21:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 02:21:48.200951 | orchestrator | 2026-04-20 02:21:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:48.200985 | orchestrator | 2026-04-20 02:21:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:21:51.249440 | orchestrator | 2026-04-20 02:21:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:21:51.253227 | orchestrator | 2026-04-20 02:21:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:51.253295 | orchestrator | 2026-04-20 02:21:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:21:54.302426 | orchestrator | 2026-04-20 02:21:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:21:54.304066 | orchestrator | 2026-04-20 02:21:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:54.304118 | orchestrator | 2026-04-20 02:21:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:21:57.353335 | orchestrator | 2026-04-20 02:21:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:21:57.355623 | orchestrator | 2026-04-20 02:21:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:21:57.355704 | orchestrator | 2026-04-20 02:21:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:22:00.408009 | orchestrator | 2026-04-20 02:22:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:22:00.410285 | orchestrator | 2026-04-20 02:22:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:22:00.410343 | orchestrator | 2026-04-20 02:22:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:22:03.466099 | orchestrator | 2026-04-20 02:22:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:22:03.468436 | orchestrator | 2026-04-20 02:22:03 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:22:03.468495 | orchestrator | 2026-04-20 02:22:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:24:06.621802 | orchestrator | 2026-04-20 02:24:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:24:06.621916 | orchestrator | 2026-04-20 02:24:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:24:06.621931 | orchestrator | 2026-04-20 02:24:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:24:09.675015 | orchestrator | 2026-04-20 02:24:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:24:09.677247 | orchestrator | 2026-04-20 02:24:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:24:09.677295 | orchestrator | 2026-04-20 02:24:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:24:12.729070 | orchestrator | 2026-04-20 02:24:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:24:12.731765 | orchestrator | 2026-04-20 02:24:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:24:12.731806 | orchestrator | 2026-04-20 02:24:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:24:15.786763 | orchestrator | 2026-04-20 02:24:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:24:15.788779 | orchestrator | 2026-04-20 02:24:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:24:15.788856 | orchestrator | 2026-04-20 02:24:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:24:18.825884 | orchestrator | 2026-04-20 02:24:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:24:18.827740 | orchestrator | 2026-04-20 02:24:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
02:24:18.827807 | orchestrator | 2026-04-20 02:24:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:24:21.875043 | orchestrator | 2026-04-20 02:24:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:24:21.876366 | orchestrator | 2026-04-20 02:24:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:24:21.876484 | orchestrator | 2026-04-20 02:24:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:24:24.920846 | orchestrator | 2026-04-20 02:24:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:24:24.922081 | orchestrator | 2026-04-20 02:24:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:24:24.922118 | orchestrator | 2026-04-20 02:24:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:24:27.973785 | orchestrator | 2026-04-20 02:24:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:24:27.975403 | orchestrator | 2026-04-20 02:24:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:24:27.975493 | orchestrator | 2026-04-20 02:24:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:24:31.031352 | orchestrator | 2026-04-20 02:24:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:24:31.031758 | orchestrator | 2026-04-20 02:24:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:24:31.031903 | orchestrator | 2026-04-20 02:24:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:24:34.080095 | orchestrator | 2026-04-20 02:24:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:24:34.080856 | orchestrator | 2026-04-20 02:24:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:24:34.080917 | orchestrator | 2026-04-20 02:24:34 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 02:24:37.135496 | orchestrator | 2026-04-20 02:24:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:24:37.136954 | orchestrator | 2026-04-20 02:24:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:24:37.137009 | orchestrator | 2026-04-20 02:24:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:24:40.185534 | orchestrator | 2026-04-20 02:24:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:24:40.187066 | orchestrator | 2026-04-20 02:24:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:24:40.187104 | orchestrator | 2026-04-20 02:24:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:24:43.232636 | orchestrator | 2026-04-20 02:24:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:24:43.234303 | orchestrator | 2026-04-20 02:24:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:24:43.234354 | orchestrator | 2026-04-20 02:24:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:24:46.280624 | orchestrator | 2026-04-20 02:24:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:24:46.282265 | orchestrator | 2026-04-20 02:24:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:24:46.282446 | orchestrator | 2026-04-20 02:24:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:24:49.330524 | orchestrator | 2026-04-20 02:24:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:24:49.332696 | orchestrator | 2026-04-20 02:24:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:24:49.332731 | orchestrator | 2026-04-20 02:24:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:24:52.382071 | orchestrator | 2026-04-20 
02:24:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:24:52.384090 | orchestrator | 2026-04-20 02:24:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:24:52.384256 | orchestrator | 2026-04-20 02:24:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:24:55.428368 | orchestrator | 2026-04-20 02:24:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:24:55.428720 | orchestrator | 2026-04-20 02:24:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:24:55.428750 | orchestrator | 2026-04-20 02:24:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:24:58.476093 | orchestrator | 2026-04-20 02:24:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:24:58.476893 | orchestrator | 2026-04-20 02:24:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:24:58.476935 | orchestrator | 2026-04-20 02:24:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:25:01.526619 | orchestrator | 2026-04-20 02:25:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:25:01.528307 | orchestrator | 2026-04-20 02:25:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:25:01.528353 | orchestrator | 2026-04-20 02:25:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:25:04.577961 | orchestrator | 2026-04-20 02:25:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:25:04.580643 | orchestrator | 2026-04-20 02:25:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:25:04.580781 | orchestrator | 2026-04-20 02:25:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:25:07.631474 | orchestrator | 2026-04-20 02:25:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 02:25:07.633458 | orchestrator | 2026-04-20 02:25:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:25:07.633506 | orchestrator | 2026-04-20 02:25:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:25:10.681512 | orchestrator | 2026-04-20 02:25:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:25:10.681970 | orchestrator | 2026-04-20 02:25:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:25:10.682070 | orchestrator | 2026-04-20 02:25:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:25:13.736224 | orchestrator | 2026-04-20 02:25:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:25:13.738358 | orchestrator | 2026-04-20 02:25:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:25:13.738470 | orchestrator | 2026-04-20 02:25:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:25:16.787711 | orchestrator | 2026-04-20 02:25:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:25:16.789210 | orchestrator | 2026-04-20 02:25:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:25:16.789316 | orchestrator | 2026-04-20 02:25:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:25:19.842733 | orchestrator | 2026-04-20 02:25:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:25:19.845178 | orchestrator | 2026-04-20 02:25:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:25:19.845258 | orchestrator | 2026-04-20 02:25:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:25:22.884991 | orchestrator | 2026-04-20 02:25:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:25:22.886799 | orchestrator | 2026-04-20 02:25:22 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:25:22.886874 | orchestrator | 2026-04-20 02:25:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:25:25.924331 | orchestrator | 2026-04-20 02:25:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:25:25.926242 | orchestrator | 2026-04-20 02:25:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:25:25.926304 | orchestrator | 2026-04-20 02:25:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:25:28.975434 | orchestrator | 2026-04-20 02:25:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:25:28.977253 | orchestrator | 2026-04-20 02:25:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:25:28.977343 | orchestrator | 2026-04-20 02:25:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:25:32.029329 | orchestrator | 2026-04-20 02:25:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:25:32.031728 | orchestrator | 2026-04-20 02:25:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:25:32.031769 | orchestrator | 2026-04-20 02:25:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:25:35.078505 | orchestrator | 2026-04-20 02:25:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:25:35.079878 | orchestrator | 2026-04-20 02:25:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:25:35.079936 | orchestrator | 2026-04-20 02:25:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:25:38.128204 | orchestrator | 2026-04-20 02:25:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:25:38.130798 | orchestrator | 2026-04-20 02:25:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
02:25:38.130838 | orchestrator | 2026-04-20 02:25:38 | INFO  | Wait 1 second(s) until the next check
2026-04-20 02:25:41.185732 | orchestrator | 2026-04-20 02:25:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 02:25:41.187475 | orchestrator | 2026-04-20 02:25:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 02:25:41.187530 | orchestrator | 2026-04-20 02:25:41 | INFO  | Wait 1 second(s) until the next check
[... identical status checks repeated every ~3 seconds from 02:25:44 through 02:30:37; both tasks remain in state STARTED throughout ...]
2026-04-20 02:30:40.231430 | orchestrator | 2026-04-20 02:30:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 02:30:40.233107 | orchestrator | 2026-04-20 02:30:40 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:30:40.233151 | orchestrator | 2026-04-20 02:30:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:30:43.279650 | orchestrator | 2026-04-20 02:30:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:30:43.281847 | orchestrator | 2026-04-20 02:30:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:30:43.282011 | orchestrator | 2026-04-20 02:30:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:30:46.327019 | orchestrator | 2026-04-20 02:30:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:30:46.327930 | orchestrator | 2026-04-20 02:30:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:30:46.328862 | orchestrator | 2026-04-20 02:30:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:30:49.375497 | orchestrator | 2026-04-20 02:30:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:30:49.377140 | orchestrator | 2026-04-20 02:30:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:30:49.377178 | orchestrator | 2026-04-20 02:30:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:30:52.422798 | orchestrator | 2026-04-20 02:30:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:30:52.424575 | orchestrator | 2026-04-20 02:30:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:30:52.424670 | orchestrator | 2026-04-20 02:30:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:30:55.476839 | orchestrator | 2026-04-20 02:30:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:30:55.479959 | orchestrator | 2026-04-20 02:30:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
02:30:55.480010 | orchestrator | 2026-04-20 02:30:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:30:58.524842 | orchestrator | 2026-04-20 02:30:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:30:58.525107 | orchestrator | 2026-04-20 02:30:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:30:58.525173 | orchestrator | 2026-04-20 02:30:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:01.569318 | orchestrator | 2026-04-20 02:31:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:01.570940 | orchestrator | 2026-04-20 02:31:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:01.571002 | orchestrator | 2026-04-20 02:31:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:04.619895 | orchestrator | 2026-04-20 02:31:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:04.621926 | orchestrator | 2026-04-20 02:31:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:04.622115 | orchestrator | 2026-04-20 02:31:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:07.668130 | orchestrator | 2026-04-20 02:31:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:07.670659 | orchestrator | 2026-04-20 02:31:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:07.670707 | orchestrator | 2026-04-20 02:31:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:10.725528 | orchestrator | 2026-04-20 02:31:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:10.727110 | orchestrator | 2026-04-20 02:31:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:10.727135 | orchestrator | 2026-04-20 02:31:10 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 02:31:13.777564 | orchestrator | 2026-04-20 02:31:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:13.780409 | orchestrator | 2026-04-20 02:31:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:13.780457 | orchestrator | 2026-04-20 02:31:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:16.828540 | orchestrator | 2026-04-20 02:31:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:16.830289 | orchestrator | 2026-04-20 02:31:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:16.830451 | orchestrator | 2026-04-20 02:31:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:19.871341 | orchestrator | 2026-04-20 02:31:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:19.873348 | orchestrator | 2026-04-20 02:31:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:19.873411 | orchestrator | 2026-04-20 02:31:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:22.917612 | orchestrator | 2026-04-20 02:31:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:22.918991 | orchestrator | 2026-04-20 02:31:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:22.919044 | orchestrator | 2026-04-20 02:31:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:25.966727 | orchestrator | 2026-04-20 02:31:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:25.970207 | orchestrator | 2026-04-20 02:31:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:25.970273 | orchestrator | 2026-04-20 02:31:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:29.022477 | orchestrator | 2026-04-20 
02:31:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:29.024204 | orchestrator | 2026-04-20 02:31:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:29.024407 | orchestrator | 2026-04-20 02:31:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:32.072852 | orchestrator | 2026-04-20 02:31:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:32.076286 | orchestrator | 2026-04-20 02:31:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:32.076354 | orchestrator | 2026-04-20 02:31:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:35.126774 | orchestrator | 2026-04-20 02:31:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:35.128653 | orchestrator | 2026-04-20 02:31:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:35.128711 | orchestrator | 2026-04-20 02:31:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:38.181289 | orchestrator | 2026-04-20 02:31:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:38.182683 | orchestrator | 2026-04-20 02:31:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:38.182723 | orchestrator | 2026-04-20 02:31:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:41.233642 | orchestrator | 2026-04-20 02:31:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:41.235056 | orchestrator | 2026-04-20 02:31:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:41.235102 | orchestrator | 2026-04-20 02:31:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:44.283305 | orchestrator | 2026-04-20 02:31:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 02:31:44.285723 | orchestrator | 2026-04-20 02:31:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:44.285808 | orchestrator | 2026-04-20 02:31:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:47.333505 | orchestrator | 2026-04-20 02:31:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:47.334116 | orchestrator | 2026-04-20 02:31:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:47.334158 | orchestrator | 2026-04-20 02:31:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:50.376497 | orchestrator | 2026-04-20 02:31:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:50.379810 | orchestrator | 2026-04-20 02:31:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:50.379907 | orchestrator | 2026-04-20 02:31:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:53.429670 | orchestrator | 2026-04-20 02:31:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:53.431635 | orchestrator | 2026-04-20 02:31:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:53.431699 | orchestrator | 2026-04-20 02:31:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:56.486269 | orchestrator | 2026-04-20 02:31:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:56.488281 | orchestrator | 2026-04-20 02:31:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:56.488321 | orchestrator | 2026-04-20 02:31:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:31:59.540601 | orchestrator | 2026-04-20 02:31:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:31:59.542076 | orchestrator | 2026-04-20 02:31:59 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:31:59.542175 | orchestrator | 2026-04-20 02:31:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:32:02.592367 | orchestrator | 2026-04-20 02:32:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:02.594151 | orchestrator | 2026-04-20 02:32:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:32:02.594243 | orchestrator | 2026-04-20 02:32:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:32:05.641626 | orchestrator | 2026-04-20 02:32:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:05.642875 | orchestrator | 2026-04-20 02:32:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:32:05.642991 | orchestrator | 2026-04-20 02:32:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:32:08.691517 | orchestrator | 2026-04-20 02:32:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:08.693693 | orchestrator | 2026-04-20 02:32:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:32:08.693747 | orchestrator | 2026-04-20 02:32:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:32:11.744814 | orchestrator | 2026-04-20 02:32:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:11.747060 | orchestrator | 2026-04-20 02:32:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:32:11.747137 | orchestrator | 2026-04-20 02:32:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:32:14.785246 | orchestrator | 2026-04-20 02:32:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:14.786264 | orchestrator | 2026-04-20 02:32:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
02:32:14.786359 | orchestrator | 2026-04-20 02:32:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:32:17.830701 | orchestrator | 2026-04-20 02:32:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:17.831756 | orchestrator | 2026-04-20 02:32:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:32:17.831817 | orchestrator | 2026-04-20 02:32:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:32:20.875220 | orchestrator | 2026-04-20 02:32:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:20.877901 | orchestrator | 2026-04-20 02:32:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:32:20.877939 | orchestrator | 2026-04-20 02:32:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:32:23.920322 | orchestrator | 2026-04-20 02:32:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:23.922065 | orchestrator | 2026-04-20 02:32:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:32:23.922121 | orchestrator | 2026-04-20 02:32:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:32:26.968536 | orchestrator | 2026-04-20 02:32:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:26.970308 | orchestrator | 2026-04-20 02:32:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:32:26.970380 | orchestrator | 2026-04-20 02:32:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:32:30.018413 | orchestrator | 2026-04-20 02:32:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:30.019060 | orchestrator | 2026-04-20 02:32:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:32:30.019388 | orchestrator | 2026-04-20 02:32:30 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 02:32:33.065605 | orchestrator | 2026-04-20 02:32:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:33.068684 | orchestrator | 2026-04-20 02:32:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:32:33.068748 | orchestrator | 2026-04-20 02:32:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:32:36.110315 | orchestrator | 2026-04-20 02:32:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:36.111557 | orchestrator | 2026-04-20 02:32:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:32:36.111606 | orchestrator | 2026-04-20 02:32:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:32:39.162009 | orchestrator | 2026-04-20 02:32:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:39.162973 | orchestrator | 2026-04-20 02:32:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:32:39.163165 | orchestrator | 2026-04-20 02:32:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:32:42.217937 | orchestrator | 2026-04-20 02:32:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:42.220134 | orchestrator | 2026-04-20 02:32:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:32:42.220189 | orchestrator | 2026-04-20 02:32:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:32:45.275097 | orchestrator | 2026-04-20 02:32:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:45.276448 | orchestrator | 2026-04-20 02:32:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:32:45.276533 | orchestrator | 2026-04-20 02:32:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:32:48.325783 | orchestrator | 2026-04-20 
02:32:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:48.328893 | orchestrator | 2026-04-20 02:32:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:32:48.328951 | orchestrator | 2026-04-20 02:32:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:32:51.378819 | orchestrator | 2026-04-20 02:32:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:51.380961 | orchestrator | 2026-04-20 02:32:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:32:51.381048 | orchestrator | 2026-04-20 02:32:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:32:54.432213 | orchestrator | 2026-04-20 02:32:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:54.433933 | orchestrator | 2026-04-20 02:32:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:32:54.434161 | orchestrator | 2026-04-20 02:32:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:32:57.481683 | orchestrator | 2026-04-20 02:32:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:32:57.483283 | orchestrator | 2026-04-20 02:32:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:32:57.485110 | orchestrator | 2026-04-20 02:32:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:00.529159 | orchestrator | 2026-04-20 02:33:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:00.531082 | orchestrator | 2026-04-20 02:33:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:00.531160 | orchestrator | 2026-04-20 02:33:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:03.582320 | orchestrator | 2026-04-20 02:33:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 02:33:03.583245 | orchestrator | 2026-04-20 02:33:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:03.583272 | orchestrator | 2026-04-20 02:33:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:06.626644 | orchestrator | 2026-04-20 02:33:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:06.628423 | orchestrator | 2026-04-20 02:33:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:06.628596 | orchestrator | 2026-04-20 02:33:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:09.682290 | orchestrator | 2026-04-20 02:33:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:09.683902 | orchestrator | 2026-04-20 02:33:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:09.683962 | orchestrator | 2026-04-20 02:33:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:12.730360 | orchestrator | 2026-04-20 02:33:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:12.732273 | orchestrator | 2026-04-20 02:33:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:12.732364 | orchestrator | 2026-04-20 02:33:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:15.783481 | orchestrator | 2026-04-20 02:33:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:15.784913 | orchestrator | 2026-04-20 02:33:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:15.784965 | orchestrator | 2026-04-20 02:33:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:18.838046 | orchestrator | 2026-04-20 02:33:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:18.839317 | orchestrator | 2026-04-20 02:33:18 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:18.839348 | orchestrator | 2026-04-20 02:33:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:21.885124 | orchestrator | 2026-04-20 02:33:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:21.887202 | orchestrator | 2026-04-20 02:33:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:21.887247 | orchestrator | 2026-04-20 02:33:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:24.938668 | orchestrator | 2026-04-20 02:33:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:24.940588 | orchestrator | 2026-04-20 02:33:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:24.940640 | orchestrator | 2026-04-20 02:33:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:27.995136 | orchestrator | 2026-04-20 02:33:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:27.997223 | orchestrator | 2026-04-20 02:33:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:27.997316 | orchestrator | 2026-04-20 02:33:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:31.045733 | orchestrator | 2026-04-20 02:33:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:31.047710 | orchestrator | 2026-04-20 02:33:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:31.047785 | orchestrator | 2026-04-20 02:33:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:34.100343 | orchestrator | 2026-04-20 02:33:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:34.100794 | orchestrator | 2026-04-20 02:33:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
02:33:34.100827 | orchestrator | 2026-04-20 02:33:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:37.152791 | orchestrator | 2026-04-20 02:33:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:37.154716 | orchestrator | 2026-04-20 02:33:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:37.154777 | orchestrator | 2026-04-20 02:33:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:40.200200 | orchestrator | 2026-04-20 02:33:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:40.202270 | orchestrator | 2026-04-20 02:33:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:40.202625 | orchestrator | 2026-04-20 02:33:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:43.256149 | orchestrator | 2026-04-20 02:33:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:43.258085 | orchestrator | 2026-04-20 02:33:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:43.258166 | orchestrator | 2026-04-20 02:33:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:46.301997 | orchestrator | 2026-04-20 02:33:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:46.304469 | orchestrator | 2026-04-20 02:33:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:46.304612 | orchestrator | 2026-04-20 02:33:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:49.354547 | orchestrator | 2026-04-20 02:33:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:49.356707 | orchestrator | 2026-04-20 02:33:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:49.356774 | orchestrator | 2026-04-20 02:33:49 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 02:33:52.406528 | orchestrator | 2026-04-20 02:33:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:52.407970 | orchestrator | 2026-04-20 02:33:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:52.408071 | orchestrator | 2026-04-20 02:33:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:55.461764 | orchestrator | 2026-04-20 02:33:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:55.463772 | orchestrator | 2026-04-20 02:33:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:55.463823 | orchestrator | 2026-04-20 02:33:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:33:58.516829 | orchestrator | 2026-04-20 02:33:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:33:58.517978 | orchestrator | 2026-04-20 02:33:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:33:58.518321 | orchestrator | 2026-04-20 02:33:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:34:01.562493 | orchestrator | 2026-04-20 02:34:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:34:01.563691 | orchestrator | 2026-04-20 02:34:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:34:01.563728 | orchestrator | 2026-04-20 02:34:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:34:04.614377 | orchestrator | 2026-04-20 02:34:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:34:04.615636 | orchestrator | 2026-04-20 02:34:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:34:04.615925 | orchestrator | 2026-04-20 02:34:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:34:07.663978 | orchestrator | 2026-04-20 
02:34:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:34:07.666435 | orchestrator | 2026-04-20 02:34:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:34:07.666859 | orchestrator | 2026-04-20 02:34:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:34:10.727786 | orchestrator | 2026-04-20 02:34:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:34:10.728512 | orchestrator | 2026-04-20 02:34:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:34:10.728581 | orchestrator | 2026-04-20 02:34:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:34:13.781193 | orchestrator | 2026-04-20 02:34:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:34:13.783390 | orchestrator | 2026-04-20 02:34:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:34:13.783469 | orchestrator | 2026-04-20 02:34:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:34:16.832704 | orchestrator | 2026-04-20 02:34:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:34:16.835385 | orchestrator | 2026-04-20 02:34:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:34:16.835420 | orchestrator | 2026-04-20 02:34:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:34:19.884348 | orchestrator | 2026-04-20 02:34:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:34:19.886245 | orchestrator | 2026-04-20 02:34:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:34:19.886316 | orchestrator | 2026-04-20 02:34:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:34:22.939785 | orchestrator | 2026-04-20 02:34:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 02:34:22.941874 | orchestrator | 2026-04-20 02:34:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 02:34:22.941927 | orchestrator | 2026-04-20 02:34:22 | INFO  | Wait 1 second(s) until the next check
2026-04-20 02:34:25.992932 | orchestrator | 2026-04-20 02:34:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 02:34:25.995430 | orchestrator | 2026-04-20 02:34:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 02:34:25.995535 | orchestrator | 2026-04-20 02:34:25 | INFO  | Wait 1 second(s) until the next check
[... identical poll cycle repeated every ~3 seconds from 02:34:29 through 02:39:52: tasks 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e and 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c remained in state STARTED throughout ...]
2026-04-20 02:39:55.435502 | orchestrator | 2026-04-20 02:39:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 02:39:55.437102 | orchestrator | 2026-04-20 02:39:55 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:39:55.437156 | orchestrator | 2026-04-20 02:39:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:39:58.484988 | orchestrator | 2026-04-20 02:39:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:39:58.486389 | orchestrator | 2026-04-20 02:39:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:39:58.486465 | orchestrator | 2026-04-20 02:39:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:01.532300 | orchestrator | 2026-04-20 02:40:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:01.533259 | orchestrator | 2026-04-20 02:40:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:01.533296 | orchestrator | 2026-04-20 02:40:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:04.582598 | orchestrator | 2026-04-20 02:40:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:04.584455 | orchestrator | 2026-04-20 02:40:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:04.584495 | orchestrator | 2026-04-20 02:40:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:07.626735 | orchestrator | 2026-04-20 02:40:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:07.628809 | orchestrator | 2026-04-20 02:40:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:07.628880 | orchestrator | 2026-04-20 02:40:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:10.675348 | orchestrator | 2026-04-20 02:40:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:10.677425 | orchestrator | 2026-04-20 02:40:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
02:40:10.677523 | orchestrator | 2026-04-20 02:40:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:13.726930 | orchestrator | 2026-04-20 02:40:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:13.729244 | orchestrator | 2026-04-20 02:40:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:13.729318 | orchestrator | 2026-04-20 02:40:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:16.776297 | orchestrator | 2026-04-20 02:40:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:16.779924 | orchestrator | 2026-04-20 02:40:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:16.779996 | orchestrator | 2026-04-20 02:40:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:19.835265 | orchestrator | 2026-04-20 02:40:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:19.840228 | orchestrator | 2026-04-20 02:40:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:19.840341 | orchestrator | 2026-04-20 02:40:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:22.890614 | orchestrator | 2026-04-20 02:40:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:22.892698 | orchestrator | 2026-04-20 02:40:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:22.892836 | orchestrator | 2026-04-20 02:40:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:25.943976 | orchestrator | 2026-04-20 02:40:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:25.945137 | orchestrator | 2026-04-20 02:40:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:25.945194 | orchestrator | 2026-04-20 02:40:25 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 02:40:28.993620 | orchestrator | 2026-04-20 02:40:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:28.995390 | orchestrator | 2026-04-20 02:40:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:28.995451 | orchestrator | 2026-04-20 02:40:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:32.053305 | orchestrator | 2026-04-20 02:40:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:32.054768 | orchestrator | 2026-04-20 02:40:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:32.054821 | orchestrator | 2026-04-20 02:40:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:35.105365 | orchestrator | 2026-04-20 02:40:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:35.106982 | orchestrator | 2026-04-20 02:40:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:35.107016 | orchestrator | 2026-04-20 02:40:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:38.147153 | orchestrator | 2026-04-20 02:40:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:38.149142 | orchestrator | 2026-04-20 02:40:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:38.149258 | orchestrator | 2026-04-20 02:40:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:41.198354 | orchestrator | 2026-04-20 02:40:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:41.200079 | orchestrator | 2026-04-20 02:40:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:41.200132 | orchestrator | 2026-04-20 02:40:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:44.244738 | orchestrator | 2026-04-20 
02:40:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:44.247375 | orchestrator | 2026-04-20 02:40:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:44.247497 | orchestrator | 2026-04-20 02:40:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:47.294309 | orchestrator | 2026-04-20 02:40:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:47.296331 | orchestrator | 2026-04-20 02:40:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:47.296489 | orchestrator | 2026-04-20 02:40:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:50.347375 | orchestrator | 2026-04-20 02:40:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:50.349585 | orchestrator | 2026-04-20 02:40:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:50.349651 | orchestrator | 2026-04-20 02:40:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:53.394210 | orchestrator | 2026-04-20 02:40:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:53.395730 | orchestrator | 2026-04-20 02:40:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:53.395781 | orchestrator | 2026-04-20 02:40:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:56.441576 | orchestrator | 2026-04-20 02:40:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:40:56.443248 | orchestrator | 2026-04-20 02:40:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:56.443283 | orchestrator | 2026-04-20 02:40:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:40:59.493421 | orchestrator | 2026-04-20 02:40:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 02:40:59.495291 | orchestrator | 2026-04-20 02:40:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:40:59.495342 | orchestrator | 2026-04-20 02:40:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:41:02.535762 | orchestrator | 2026-04-20 02:41:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:02.537398 | orchestrator | 2026-04-20 02:41:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:41:02.537635 | orchestrator | 2026-04-20 02:41:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:41:05.582382 | orchestrator | 2026-04-20 02:41:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:05.584073 | orchestrator | 2026-04-20 02:41:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:41:05.584182 | orchestrator | 2026-04-20 02:41:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:41:08.629621 | orchestrator | 2026-04-20 02:41:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:08.631733 | orchestrator | 2026-04-20 02:41:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:41:08.631851 | orchestrator | 2026-04-20 02:41:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:41:11.676109 | orchestrator | 2026-04-20 02:41:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:11.677181 | orchestrator | 2026-04-20 02:41:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:41:11.677385 | orchestrator | 2026-04-20 02:41:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:41:14.727517 | orchestrator | 2026-04-20 02:41:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:14.729746 | orchestrator | 2026-04-20 02:41:14 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:41:14.729930 | orchestrator | 2026-04-20 02:41:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:41:17.781039 | orchestrator | 2026-04-20 02:41:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:17.782478 | orchestrator | 2026-04-20 02:41:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:41:17.782579 | orchestrator | 2026-04-20 02:41:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:41:20.822653 | orchestrator | 2026-04-20 02:41:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:20.824027 | orchestrator | 2026-04-20 02:41:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:41:20.824059 | orchestrator | 2026-04-20 02:41:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:41:23.871673 | orchestrator | 2026-04-20 02:41:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:23.873181 | orchestrator | 2026-04-20 02:41:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:41:23.873227 | orchestrator | 2026-04-20 02:41:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:41:26.916383 | orchestrator | 2026-04-20 02:41:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:26.917997 | orchestrator | 2026-04-20 02:41:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:41:26.918195 | orchestrator | 2026-04-20 02:41:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:41:29.974134 | orchestrator | 2026-04-20 02:41:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:29.975983 | orchestrator | 2026-04-20 02:41:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
02:41:29.976022 | orchestrator | 2026-04-20 02:41:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:41:33.021481 | orchestrator | 2026-04-20 02:41:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:33.023921 | orchestrator | 2026-04-20 02:41:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:41:33.024209 | orchestrator | 2026-04-20 02:41:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:41:36.074844 | orchestrator | 2026-04-20 02:41:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:36.076634 | orchestrator | 2026-04-20 02:41:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:41:36.076681 | orchestrator | 2026-04-20 02:41:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:41:39.122171 | orchestrator | 2026-04-20 02:41:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:39.124042 | orchestrator | 2026-04-20 02:41:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:41:39.124138 | orchestrator | 2026-04-20 02:41:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:41:42.172877 | orchestrator | 2026-04-20 02:41:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:42.174345 | orchestrator | 2026-04-20 02:41:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:41:42.174398 | orchestrator | 2026-04-20 02:41:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:41:45.217236 | orchestrator | 2026-04-20 02:41:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:45.218163 | orchestrator | 2026-04-20 02:41:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:41:45.218237 | orchestrator | 2026-04-20 02:41:45 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 02:41:48.269495 | orchestrator | 2026-04-20 02:41:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:48.271599 | orchestrator | 2026-04-20 02:41:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:41:48.271673 | orchestrator | 2026-04-20 02:41:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:41:51.315270 | orchestrator | 2026-04-20 02:41:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:51.318590 | orchestrator | 2026-04-20 02:41:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:41:51.318654 | orchestrator | 2026-04-20 02:41:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:41:54.369130 | orchestrator | 2026-04-20 02:41:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:54.369989 | orchestrator | 2026-04-20 02:41:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:41:54.370117 | orchestrator | 2026-04-20 02:41:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:41:57.407118 | orchestrator | 2026-04-20 02:41:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:41:57.409467 | orchestrator | 2026-04-20 02:41:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:41:57.409764 | orchestrator | 2026-04-20 02:41:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:00.458493 | orchestrator | 2026-04-20 02:42:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:00.461533 | orchestrator | 2026-04-20 02:42:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:00.461595 | orchestrator | 2026-04-20 02:42:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:03.506474 | orchestrator | 2026-04-20 
02:42:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:03.507084 | orchestrator | 2026-04-20 02:42:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:03.507115 | orchestrator | 2026-04-20 02:42:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:06.563126 | orchestrator | 2026-04-20 02:42:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:06.564487 | orchestrator | 2026-04-20 02:42:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:06.564558 | orchestrator | 2026-04-20 02:42:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:09.614349 | orchestrator | 2026-04-20 02:42:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:09.615880 | orchestrator | 2026-04-20 02:42:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:09.615930 | orchestrator | 2026-04-20 02:42:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:12.664792 | orchestrator | 2026-04-20 02:42:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:12.666278 | orchestrator | 2026-04-20 02:42:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:12.666429 | orchestrator | 2026-04-20 02:42:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:15.709882 | orchestrator | 2026-04-20 02:42:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:15.711341 | orchestrator | 2026-04-20 02:42:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:15.711394 | orchestrator | 2026-04-20 02:42:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:18.757016 | orchestrator | 2026-04-20 02:42:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 02:42:18.758105 | orchestrator | 2026-04-20 02:42:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:18.758153 | orchestrator | 2026-04-20 02:42:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:21.806658 | orchestrator | 2026-04-20 02:42:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:21.809102 | orchestrator | 2026-04-20 02:42:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:21.809170 | orchestrator | 2026-04-20 02:42:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:24.849636 | orchestrator | 2026-04-20 02:42:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:24.850513 | orchestrator | 2026-04-20 02:42:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:24.850566 | orchestrator | 2026-04-20 02:42:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:27.897138 | orchestrator | 2026-04-20 02:42:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:27.899010 | orchestrator | 2026-04-20 02:42:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:27.899073 | orchestrator | 2026-04-20 02:42:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:30.952289 | orchestrator | 2026-04-20 02:42:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:30.954239 | orchestrator | 2026-04-20 02:42:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:30.954289 | orchestrator | 2026-04-20 02:42:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:34.009384 | orchestrator | 2026-04-20 02:42:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:34.011970 | orchestrator | 2026-04-20 02:42:34 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:34.012123 | orchestrator | 2026-04-20 02:42:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:37.066934 | orchestrator | 2026-04-20 02:42:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:37.068079 | orchestrator | 2026-04-20 02:42:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:37.068138 | orchestrator | 2026-04-20 02:42:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:40.111247 | orchestrator | 2026-04-20 02:42:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:40.111586 | orchestrator | 2026-04-20 02:42:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:40.111616 | orchestrator | 2026-04-20 02:42:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:43.155959 | orchestrator | 2026-04-20 02:42:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:43.157379 | orchestrator | 2026-04-20 02:42:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:43.157618 | orchestrator | 2026-04-20 02:42:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:46.203832 | orchestrator | 2026-04-20 02:42:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:46.205419 | orchestrator | 2026-04-20 02:42:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:46.205500 | orchestrator | 2026-04-20 02:42:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:49.254175 | orchestrator | 2026-04-20 02:42:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:49.256330 | orchestrator | 2026-04-20 02:42:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
02:42:49.256426 | orchestrator | 2026-04-20 02:42:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:52.308590 | orchestrator | 2026-04-20 02:42:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:52.310262 | orchestrator | 2026-04-20 02:42:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:52.310309 | orchestrator | 2026-04-20 02:42:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:55.351347 | orchestrator | 2026-04-20 02:42:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:55.354342 | orchestrator | 2026-04-20 02:42:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:55.354422 | orchestrator | 2026-04-20 02:42:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:42:58.403746 | orchestrator | 2026-04-20 02:42:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:42:58.404856 | orchestrator | 2026-04-20 02:42:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:42:58.404931 | orchestrator | 2026-04-20 02:42:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:43:01.455914 | orchestrator | 2026-04-20 02:43:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:43:01.457496 | orchestrator | 2026-04-20 02:43:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:43:01.457558 | orchestrator | 2026-04-20 02:43:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:43:04.505345 | orchestrator | 2026-04-20 02:43:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:43:04.506394 | orchestrator | 2026-04-20 02:43:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:43:04.506446 | orchestrator | 2026-04-20 02:43:04 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 02:43:07.552155 | orchestrator | 2026-04-20 02:43:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:43:07.554193 | orchestrator | 2026-04-20 02:43:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:43:07.554268 | orchestrator | 2026-04-20 02:43:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:43:10.601635 | orchestrator | 2026-04-20 02:43:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:43:10.603916 | orchestrator | 2026-04-20 02:43:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:43:10.603990 | orchestrator | 2026-04-20 02:43:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:43:13.647593 | orchestrator | 2026-04-20 02:43:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:43:13.649100 | orchestrator | 2026-04-20 02:43:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:43:13.649261 | orchestrator | 2026-04-20 02:43:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:43:16.693934 | orchestrator | 2026-04-20 02:43:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:43:16.695890 | orchestrator | 2026-04-20 02:43:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:43:16.695965 | orchestrator | 2026-04-20 02:43:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:43:19.744320 | orchestrator | 2026-04-20 02:43:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:43:19.747854 | orchestrator | 2026-04-20 02:43:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:43:19.747954 | orchestrator | 2026-04-20 02:43:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:43:22.798473 | orchestrator | 2026-04-20 
02:43:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:43:22.799911 | orchestrator | 2026-04-20 02:43:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:43:22.799938 | orchestrator | 2026-04-20 02:43:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:43:25.846112 | orchestrator | 2026-04-20 02:43:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:43:25.848066 | orchestrator | 2026-04-20 02:43:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:43:25.848185 | orchestrator | 2026-04-20 02:43:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:43:28.891757 | orchestrator | 2026-04-20 02:43:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:43:28.894461 | orchestrator | 2026-04-20 02:43:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:43:28.894537 | orchestrator | 2026-04-20 02:43:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:43:31.945355 | orchestrator | 2026-04-20 02:43:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:43:31.945747 | orchestrator | 2026-04-20 02:43:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:43:31.945782 | orchestrator | 2026-04-20 02:43:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:43:34.986969 | orchestrator | 2026-04-20 02:43:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:43:34.987317 | orchestrator | 2026-04-20 02:43:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:43:34.987361 | orchestrator | 2026-04-20 02:43:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:43:38.042991 | orchestrator | 2026-04-20 02:43:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 02:43:38.046280 | orchestrator | 2026-04-20 02:43:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:43:38.046342 | orchestrator | 2026-04-20 02:43:38 | INFO  | Wait 1 second(s) until the next check
[... identical polling output elided: tasks 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e and 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c remained in state STARTED, re-checked every ~3 seconds from 02:43:41 through 02:48:52 ...]
2026-04-20 02:48:55.210347 | orchestrator | 2026-04-20 02:48:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 02:48:55.213497 | orchestrator | 2026-04-20 02:48:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:48:55.213559 | orchestrator | 2026-04-20 02:48:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:48:58.255451 | orchestrator | 2026-04-20 02:48:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:48:58.256421 | orchestrator | 2026-04-20 02:48:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:48:58.256453 | orchestrator | 2026-04-20 02:48:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:01.298355 | orchestrator | 2026-04-20 02:49:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:01.298796 | orchestrator | 2026-04-20 02:49:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:01.298827 | orchestrator | 2026-04-20 02:49:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:04.348893 | orchestrator | 2026-04-20 02:49:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:04.349579 | orchestrator | 2026-04-20 02:49:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:04.349631 | orchestrator | 2026-04-20 02:49:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:07.399125 | orchestrator | 2026-04-20 02:49:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:07.403326 | orchestrator | 2026-04-20 02:49:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:07.403412 | orchestrator | 2026-04-20 02:49:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:10.444504 | orchestrator | 2026-04-20 02:49:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:10.447337 | orchestrator | 2026-04-20 02:49:10 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:10.447400 | orchestrator | 2026-04-20 02:49:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:13.496222 | orchestrator | 2026-04-20 02:49:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:13.497566 | orchestrator | 2026-04-20 02:49:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:13.497622 | orchestrator | 2026-04-20 02:49:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:16.551763 | orchestrator | 2026-04-20 02:49:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:16.553402 | orchestrator | 2026-04-20 02:49:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:16.553452 | orchestrator | 2026-04-20 02:49:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:19.607427 | orchestrator | 2026-04-20 02:49:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:19.609086 | orchestrator | 2026-04-20 02:49:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:19.609166 | orchestrator | 2026-04-20 02:49:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:22.659873 | orchestrator | 2026-04-20 02:49:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:22.662714 | orchestrator | 2026-04-20 02:49:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:22.662769 | orchestrator | 2026-04-20 02:49:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:25.712440 | orchestrator | 2026-04-20 02:49:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:25.713953 | orchestrator | 2026-04-20 02:49:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
02:49:25.714093 | orchestrator | 2026-04-20 02:49:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:28.759600 | orchestrator | 2026-04-20 02:49:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:28.761895 | orchestrator | 2026-04-20 02:49:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:28.761965 | orchestrator | 2026-04-20 02:49:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:31.806733 | orchestrator | 2026-04-20 02:49:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:31.808782 | orchestrator | 2026-04-20 02:49:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:31.808883 | orchestrator | 2026-04-20 02:49:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:34.852482 | orchestrator | 2026-04-20 02:49:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:34.853905 | orchestrator | 2026-04-20 02:49:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:34.853944 | orchestrator | 2026-04-20 02:49:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:37.890275 | orchestrator | 2026-04-20 02:49:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:37.893067 | orchestrator | 2026-04-20 02:49:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:37.893161 | orchestrator | 2026-04-20 02:49:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:40.935210 | orchestrator | 2026-04-20 02:49:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:40.936561 | orchestrator | 2026-04-20 02:49:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:40.936628 | orchestrator | 2026-04-20 02:49:40 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 02:49:43.981630 | orchestrator | 2026-04-20 02:49:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:43.983248 | orchestrator | 2026-04-20 02:49:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:43.983284 | orchestrator | 2026-04-20 02:49:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:47.039160 | orchestrator | 2026-04-20 02:49:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:47.041218 | orchestrator | 2026-04-20 02:49:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:47.041245 | orchestrator | 2026-04-20 02:49:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:50.089877 | orchestrator | 2026-04-20 02:49:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:50.091401 | orchestrator | 2026-04-20 02:49:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:50.091492 | orchestrator | 2026-04-20 02:49:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:53.143166 | orchestrator | 2026-04-20 02:49:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:53.145262 | orchestrator | 2026-04-20 02:49:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:53.145332 | orchestrator | 2026-04-20 02:49:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:56.193408 | orchestrator | 2026-04-20 02:49:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:56.195394 | orchestrator | 2026-04-20 02:49:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:56.195585 | orchestrator | 2026-04-20 02:49:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:49:59.241681 | orchestrator | 2026-04-20 
02:49:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:49:59.242918 | orchestrator | 2026-04-20 02:49:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:49:59.242966 | orchestrator | 2026-04-20 02:49:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:02.295792 | orchestrator | 2026-04-20 02:50:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:50:02.298249 | orchestrator | 2026-04-20 02:50:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:50:02.298353 | orchestrator | 2026-04-20 02:50:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:05.341509 | orchestrator | 2026-04-20 02:50:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:50:05.344127 | orchestrator | 2026-04-20 02:50:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:50:05.344179 | orchestrator | 2026-04-20 02:50:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:08.389839 | orchestrator | 2026-04-20 02:50:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:50:08.391301 | orchestrator | 2026-04-20 02:50:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:50:08.391368 | orchestrator | 2026-04-20 02:50:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:11.436091 | orchestrator | 2026-04-20 02:50:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:50:11.438660 | orchestrator | 2026-04-20 02:50:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:50:11.438685 | orchestrator | 2026-04-20 02:50:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:14.483024 | orchestrator | 2026-04-20 02:50:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 02:50:14.483829 | orchestrator | 2026-04-20 02:50:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:50:14.483939 | orchestrator | 2026-04-20 02:50:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:17.533193 | orchestrator | 2026-04-20 02:50:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:50:17.534304 | orchestrator | 2026-04-20 02:50:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:50:17.534373 | orchestrator | 2026-04-20 02:50:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:20.577518 | orchestrator | 2026-04-20 02:50:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:50:20.579369 | orchestrator | 2026-04-20 02:50:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:50:20.579417 | orchestrator | 2026-04-20 02:50:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:23.630740 | orchestrator | 2026-04-20 02:50:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:50:23.632535 | orchestrator | 2026-04-20 02:50:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:50:23.632581 | orchestrator | 2026-04-20 02:50:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:26.685745 | orchestrator | 2026-04-20 02:50:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:50:26.686942 | orchestrator | 2026-04-20 02:50:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:50:26.687046 | orchestrator | 2026-04-20 02:50:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:29.737493 | orchestrator | 2026-04-20 02:50:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:50:29.738281 | orchestrator | 2026-04-20 02:50:29 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:50:29.738412 | orchestrator | 2026-04-20 02:50:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:32.786705 | orchestrator | 2026-04-20 02:50:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:50:32.788099 | orchestrator | 2026-04-20 02:50:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:50:32.788267 | orchestrator | 2026-04-20 02:50:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:35.833157 | orchestrator | 2026-04-20 02:50:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:50:35.834936 | orchestrator | 2026-04-20 02:50:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:50:35.834999 | orchestrator | 2026-04-20 02:50:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:38.884792 | orchestrator | 2026-04-20 02:50:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:50:38.885236 | orchestrator | 2026-04-20 02:50:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:50:38.885274 | orchestrator | 2026-04-20 02:50:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:41.931383 | orchestrator | 2026-04-20 02:50:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:50:41.934118 | orchestrator | 2026-04-20 02:50:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:50:41.934189 | orchestrator | 2026-04-20 02:50:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:44.982793 | orchestrator | 2026-04-20 02:50:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:50:44.985014 | orchestrator | 2026-04-20 02:50:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
02:50:44.985080 | orchestrator | 2026-04-20 02:50:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:48.031785 | orchestrator | 2026-04-20 02:50:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:50:48.033109 | orchestrator | 2026-04-20 02:50:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:50:48.033144 | orchestrator | 2026-04-20 02:50:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:51.084288 | orchestrator | 2026-04-20 02:50:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:50:51.085870 | orchestrator | 2026-04-20 02:50:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:50:51.085925 | orchestrator | 2026-04-20 02:50:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:54.131127 | orchestrator | 2026-04-20 02:50:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:50:54.132314 | orchestrator | 2026-04-20 02:50:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:50:54.132428 | orchestrator | 2026-04-20 02:50:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:50:57.186579 | orchestrator | 2026-04-20 02:50:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:50:57.188223 | orchestrator | 2026-04-20 02:50:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:50:57.188305 | orchestrator | 2026-04-20 02:50:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:00.234917 | orchestrator | 2026-04-20 02:51:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:00.236878 | orchestrator | 2026-04-20 02:51:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:00.236960 | orchestrator | 2026-04-20 02:51:00 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 02:51:03.293898 | orchestrator | 2026-04-20 02:51:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:03.294892 | orchestrator | 2026-04-20 02:51:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:03.295157 | orchestrator | 2026-04-20 02:51:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:06.347240 | orchestrator | 2026-04-20 02:51:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:06.348789 | orchestrator | 2026-04-20 02:51:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:06.348998 | orchestrator | 2026-04-20 02:51:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:09.396762 | orchestrator | 2026-04-20 02:51:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:09.397039 | orchestrator | 2026-04-20 02:51:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:09.397075 | orchestrator | 2026-04-20 02:51:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:12.448227 | orchestrator | 2026-04-20 02:51:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:12.449992 | orchestrator | 2026-04-20 02:51:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:12.450077 | orchestrator | 2026-04-20 02:51:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:15.496190 | orchestrator | 2026-04-20 02:51:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:15.498921 | orchestrator | 2026-04-20 02:51:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:15.499022 | orchestrator | 2026-04-20 02:51:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:18.547368 | orchestrator | 2026-04-20 
02:51:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:18.548846 | orchestrator | 2026-04-20 02:51:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:18.548902 | orchestrator | 2026-04-20 02:51:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:21.594521 | orchestrator | 2026-04-20 02:51:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:21.595028 | orchestrator | 2026-04-20 02:51:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:21.595064 | orchestrator | 2026-04-20 02:51:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:24.641980 | orchestrator | 2026-04-20 02:51:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:24.643884 | orchestrator | 2026-04-20 02:51:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:24.643970 | orchestrator | 2026-04-20 02:51:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:27.702508 | orchestrator | 2026-04-20 02:51:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:27.703772 | orchestrator | 2026-04-20 02:51:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:27.703861 | orchestrator | 2026-04-20 02:51:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:30.758708 | orchestrator | 2026-04-20 02:51:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:30.760731 | orchestrator | 2026-04-20 02:51:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:30.761236 | orchestrator | 2026-04-20 02:51:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:33.811090 | orchestrator | 2026-04-20 02:51:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 02:51:33.812857 | orchestrator | 2026-04-20 02:51:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:33.812934 | orchestrator | 2026-04-20 02:51:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:36.864700 | orchestrator | 2026-04-20 02:51:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:36.866593 | orchestrator | 2026-04-20 02:51:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:36.866700 | orchestrator | 2026-04-20 02:51:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:39.915946 | orchestrator | 2026-04-20 02:51:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:39.918580 | orchestrator | 2026-04-20 02:51:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:39.918646 | orchestrator | 2026-04-20 02:51:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:42.967228 | orchestrator | 2026-04-20 02:51:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:42.968531 | orchestrator | 2026-04-20 02:51:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:42.968793 | orchestrator | 2026-04-20 02:51:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:46.017535 | orchestrator | 2026-04-20 02:51:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:46.019347 | orchestrator | 2026-04-20 02:51:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:46.019485 | orchestrator | 2026-04-20 02:51:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:49.070993 | orchestrator | 2026-04-20 02:51:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:49.072383 | orchestrator | 2026-04-20 02:51:49 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:49.072457 | orchestrator | 2026-04-20 02:51:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:52.111402 | orchestrator | 2026-04-20 02:51:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:52.113700 | orchestrator | 2026-04-20 02:51:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:52.113768 | orchestrator | 2026-04-20 02:51:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:55.170873 | orchestrator | 2026-04-20 02:51:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:55.173592 | orchestrator | 2026-04-20 02:51:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:55.173639 | orchestrator | 2026-04-20 02:51:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:51:58.223669 | orchestrator | 2026-04-20 02:51:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:51:58.226241 | orchestrator | 2026-04-20 02:51:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:51:58.226334 | orchestrator | 2026-04-20 02:51:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:52:01.274619 | orchestrator | 2026-04-20 02:52:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:52:01.276269 | orchestrator | 2026-04-20 02:52:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:52:01.276341 | orchestrator | 2026-04-20 02:52:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:52:04.317615 | orchestrator | 2026-04-20 02:52:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:52:04.321378 | orchestrator | 2026-04-20 02:52:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
02:52:04.321535 | orchestrator | 2026-04-20 02:52:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:52:07.373825 | orchestrator | 2026-04-20 02:52:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:52:07.377350 | orchestrator | 2026-04-20 02:52:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:52:07.377414 | orchestrator | 2026-04-20 02:52:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:52:10.431460 | orchestrator | 2026-04-20 02:52:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:52:10.433603 | orchestrator | 2026-04-20 02:52:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:52:10.433699 | orchestrator | 2026-04-20 02:52:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:52:13.488491 | orchestrator | 2026-04-20 02:52:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:52:13.489278 | orchestrator | 2026-04-20 02:52:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:52:13.490545 | orchestrator | 2026-04-20 02:52:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:52:16.537472 | orchestrator | 2026-04-20 02:52:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:52:16.539038 | orchestrator | 2026-04-20 02:52:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:52:16.539107 | orchestrator | 2026-04-20 02:52:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:52:19.590400 | orchestrator | 2026-04-20 02:52:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:52:19.591438 | orchestrator | 2026-04-20 02:52:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:52:19.591571 | orchestrator | 2026-04-20 02:52:19 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 02:52:22.641518 | orchestrator | 2026-04-20 02:52:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:52:22.643876 | orchestrator | 2026-04-20 02:52:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:52:22.643946 | orchestrator | 2026-04-20 02:52:22 | INFO  | Wait 1 second(s) until the next check
[... repeated polling output elided: tasks 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e and 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c both remained in state STARTED, checked roughly every 3 seconds from 02:52:22 through 02:59:36, with a gap in console output between 02:53:08 and 02:55:08 ...]
2026-04-20 02:59:36.604750 | orchestrator | 2026-04-20 02:59:36 | INFO  | Wait 1 second(s)
until the next check 2026-04-20 02:59:39.644655 | orchestrator | 2026-04-20 02:59:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:59:39.647722 | orchestrator | 2026-04-20 02:59:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:59:39.647791 | orchestrator | 2026-04-20 02:59:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:59:42.694769 | orchestrator | 2026-04-20 02:59:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:59:42.697267 | orchestrator | 2026-04-20 02:59:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:59:42.697355 | orchestrator | 2026-04-20 02:59:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:59:45.742122 | orchestrator | 2026-04-20 02:59:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:59:45.743740 | orchestrator | 2026-04-20 02:59:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:59:45.743827 | orchestrator | 2026-04-20 02:59:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:59:48.784404 | orchestrator | 2026-04-20 02:59:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:59:48.786820 | orchestrator | 2026-04-20 02:59:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:59:48.786863 | orchestrator | 2026-04-20 02:59:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:59:51.826481 | orchestrator | 2026-04-20 02:59:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:59:51.827749 | orchestrator | 2026-04-20 02:59:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:59:51.827795 | orchestrator | 2026-04-20 02:59:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:59:54.871782 | orchestrator | 2026-04-20 
02:59:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:59:54.875154 | orchestrator | 2026-04-20 02:59:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:59:54.875209 | orchestrator | 2026-04-20 02:59:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 02:59:57.917424 | orchestrator | 2026-04-20 02:59:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 02:59:57.918694 | orchestrator | 2026-04-20 02:59:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 02:59:57.918746 | orchestrator | 2026-04-20 02:59:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:00.962605 | orchestrator | 2026-04-20 03:00:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:00.964449 | orchestrator | 2026-04-20 03:00:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:00.964741 | orchestrator | 2026-04-20 03:00:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:04.007124 | orchestrator | 2026-04-20 03:00:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:04.009428 | orchestrator | 2026-04-20 03:00:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:04.009515 | orchestrator | 2026-04-20 03:00:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:07.047946 | orchestrator | 2026-04-20 03:00:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:07.049769 | orchestrator | 2026-04-20 03:00:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:07.049827 | orchestrator | 2026-04-20 03:00:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:10.089771 | orchestrator | 2026-04-20 03:00:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 03:00:10.091229 | orchestrator | 2026-04-20 03:00:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:10.091300 | orchestrator | 2026-04-20 03:00:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:13.131523 | orchestrator | 2026-04-20 03:00:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:13.132686 | orchestrator | 2026-04-20 03:00:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:13.132727 | orchestrator | 2026-04-20 03:00:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:16.174272 | orchestrator | 2026-04-20 03:00:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:16.175710 | orchestrator | 2026-04-20 03:00:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:16.175758 | orchestrator | 2026-04-20 03:00:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:19.221940 | orchestrator | 2026-04-20 03:00:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:19.223408 | orchestrator | 2026-04-20 03:00:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:19.223608 | orchestrator | 2026-04-20 03:00:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:22.269447 | orchestrator | 2026-04-20 03:00:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:22.269594 | orchestrator | 2026-04-20 03:00:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:22.269613 | orchestrator | 2026-04-20 03:00:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:25.311979 | orchestrator | 2026-04-20 03:00:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:25.313312 | orchestrator | 2026-04-20 03:00:25 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:25.313358 | orchestrator | 2026-04-20 03:00:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:28.363335 | orchestrator | 2026-04-20 03:00:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:28.367431 | orchestrator | 2026-04-20 03:00:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:28.367512 | orchestrator | 2026-04-20 03:00:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:31.417521 | orchestrator | 2026-04-20 03:00:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:31.419327 | orchestrator | 2026-04-20 03:00:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:31.419430 | orchestrator | 2026-04-20 03:00:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:34.471155 | orchestrator | 2026-04-20 03:00:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:34.471278 | orchestrator | 2026-04-20 03:00:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:34.471737 | orchestrator | 2026-04-20 03:00:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:37.519192 | orchestrator | 2026-04-20 03:00:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:37.521514 | orchestrator | 2026-04-20 03:00:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:37.521565 | orchestrator | 2026-04-20 03:00:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:40.567615 | orchestrator | 2026-04-20 03:00:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:40.568436 | orchestrator | 2026-04-20 03:00:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
03:00:40.568521 | orchestrator | 2026-04-20 03:00:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:43.614881 | orchestrator | 2026-04-20 03:00:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:43.616529 | orchestrator | 2026-04-20 03:00:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:43.617164 | orchestrator | 2026-04-20 03:00:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:46.659045 | orchestrator | 2026-04-20 03:00:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:46.660445 | orchestrator | 2026-04-20 03:00:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:46.660491 | orchestrator | 2026-04-20 03:00:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:49.717316 | orchestrator | 2026-04-20 03:00:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:49.718511 | orchestrator | 2026-04-20 03:00:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:49.718548 | orchestrator | 2026-04-20 03:00:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:52.763884 | orchestrator | 2026-04-20 03:00:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:52.766004 | orchestrator | 2026-04-20 03:00:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:52.766157 | orchestrator | 2026-04-20 03:00:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:00:55.806320 | orchestrator | 2026-04-20 03:00:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:55.807970 | orchestrator | 2026-04-20 03:00:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:55.808084 | orchestrator | 2026-04-20 03:00:55 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 03:00:58.850956 | orchestrator | 2026-04-20 03:00:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:00:58.851641 | orchestrator | 2026-04-20 03:00:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:00:58.852013 | orchestrator | 2026-04-20 03:00:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:01.899187 | orchestrator | 2026-04-20 03:01:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:01.900118 | orchestrator | 2026-04-20 03:01:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:01.900196 | orchestrator | 2026-04-20 03:01:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:04.943560 | orchestrator | 2026-04-20 03:01:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:04.944573 | orchestrator | 2026-04-20 03:01:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:04.944638 | orchestrator | 2026-04-20 03:01:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:07.987585 | orchestrator | 2026-04-20 03:01:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:07.989304 | orchestrator | 2026-04-20 03:01:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:07.989397 | orchestrator | 2026-04-20 03:01:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:11.031472 | orchestrator | 2026-04-20 03:01:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:11.032328 | orchestrator | 2026-04-20 03:01:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:11.032576 | orchestrator | 2026-04-20 03:01:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:14.074229 | orchestrator | 2026-04-20 
03:01:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:14.077187 | orchestrator | 2026-04-20 03:01:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:14.077403 | orchestrator | 2026-04-20 03:01:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:17.119302 | orchestrator | 2026-04-20 03:01:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:17.121536 | orchestrator | 2026-04-20 03:01:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:17.121584 | orchestrator | 2026-04-20 03:01:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:20.167826 | orchestrator | 2026-04-20 03:01:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:20.170251 | orchestrator | 2026-04-20 03:01:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:20.170715 | orchestrator | 2026-04-20 03:01:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:23.223514 | orchestrator | 2026-04-20 03:01:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:23.228157 | orchestrator | 2026-04-20 03:01:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:23.228230 | orchestrator | 2026-04-20 03:01:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:26.286462 | orchestrator | 2026-04-20 03:01:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:26.288277 | orchestrator | 2026-04-20 03:01:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:26.288458 | orchestrator | 2026-04-20 03:01:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:29.343287 | orchestrator | 2026-04-20 03:01:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 03:01:29.344161 | orchestrator | 2026-04-20 03:01:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:29.344194 | orchestrator | 2026-04-20 03:01:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:32.393440 | orchestrator | 2026-04-20 03:01:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:32.394700 | orchestrator | 2026-04-20 03:01:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:32.394763 | orchestrator | 2026-04-20 03:01:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:35.443906 | orchestrator | 2026-04-20 03:01:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:35.446213 | orchestrator | 2026-04-20 03:01:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:35.446292 | orchestrator | 2026-04-20 03:01:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:38.490297 | orchestrator | 2026-04-20 03:01:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:38.495221 | orchestrator | 2026-04-20 03:01:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:38.495320 | orchestrator | 2026-04-20 03:01:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:41.547927 | orchestrator | 2026-04-20 03:01:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:41.548601 | orchestrator | 2026-04-20 03:01:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:41.548627 | orchestrator | 2026-04-20 03:01:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:44.602329 | orchestrator | 2026-04-20 03:01:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:44.607339 | orchestrator | 2026-04-20 03:01:44 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:44.607467 | orchestrator | 2026-04-20 03:01:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:47.660980 | orchestrator | 2026-04-20 03:01:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:47.664801 | orchestrator | 2026-04-20 03:01:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:47.664845 | orchestrator | 2026-04-20 03:01:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:50.717296 | orchestrator | 2026-04-20 03:01:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:50.719153 | orchestrator | 2026-04-20 03:01:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:50.719213 | orchestrator | 2026-04-20 03:01:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:53.765682 | orchestrator | 2026-04-20 03:01:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:53.767469 | orchestrator | 2026-04-20 03:01:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:53.768045 | orchestrator | 2026-04-20 03:01:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:56.812627 | orchestrator | 2026-04-20 03:01:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:56.814371 | orchestrator | 2026-04-20 03:01:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:01:56.814441 | orchestrator | 2026-04-20 03:01:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:01:59.858864 | orchestrator | 2026-04-20 03:01:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:01:59.860438 | orchestrator | 2026-04-20 03:01:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
03:01:59.860615 | orchestrator | 2026-04-20 03:01:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:02:02.903900 | orchestrator | 2026-04-20 03:02:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:02:02.905594 | orchestrator | 2026-04-20 03:02:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:02.905667 | orchestrator | 2026-04-20 03:02:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:02:05.962829 | orchestrator | 2026-04-20 03:02:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:02:05.966320 | orchestrator | 2026-04-20 03:02:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:05.966384 | orchestrator | 2026-04-20 03:02:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:02:09.005787 | orchestrator | 2026-04-20 03:02:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:02:09.007634 | orchestrator | 2026-04-20 03:02:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:09.007714 | orchestrator | 2026-04-20 03:02:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:02:12.057693 | orchestrator | 2026-04-20 03:02:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:02:12.058620 | orchestrator | 2026-04-20 03:02:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:12.058666 | orchestrator | 2026-04-20 03:02:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:02:15.100480 | orchestrator | 2026-04-20 03:02:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:02:15.102594 | orchestrator | 2026-04-20 03:02:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:15.102665 | orchestrator | 2026-04-20 03:02:15 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 03:02:18.148537 | orchestrator | 2026-04-20 03:02:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:02:18.150968 | orchestrator | 2026-04-20 03:02:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:18.151031 | orchestrator | 2026-04-20 03:02:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:02:21.198665 | orchestrator | 2026-04-20 03:02:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:02:21.199931 | orchestrator | 2026-04-20 03:02:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:21.200021 | orchestrator | 2026-04-20 03:02:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:02:24.245953 | orchestrator | 2026-04-20 03:02:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:02:24.247567 | orchestrator | 2026-04-20 03:02:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:24.247627 | orchestrator | 2026-04-20 03:02:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:02:27.300755 | orchestrator | 2026-04-20 03:02:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:02:27.301880 | orchestrator | 2026-04-20 03:02:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:27.301933 | orchestrator | 2026-04-20 03:02:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:02:30.353498 | orchestrator | 2026-04-20 03:02:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:02:30.354095 | orchestrator | 2026-04-20 03:02:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:30.354809 | orchestrator | 2026-04-20 03:02:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:02:33.402629 | orchestrator | 2026-04-20 
03:02:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:02:33.403856 | orchestrator | 2026-04-20 03:02:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:33.403895 | orchestrator | 2026-04-20 03:02:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:02:36.456603 | orchestrator | 2026-04-20 03:02:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:02:36.456901 | orchestrator | 2026-04-20 03:02:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:36.456915 | orchestrator | 2026-04-20 03:02:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:02:39.508361 | orchestrator | 2026-04-20 03:02:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:02:39.510996 | orchestrator | 2026-04-20 03:02:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:39.511353 | orchestrator | 2026-04-20 03:02:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:02:42.558461 | orchestrator | 2026-04-20 03:02:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:02:42.560569 | orchestrator | 2026-04-20 03:02:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:42.560603 | orchestrator | 2026-04-20 03:02:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:02:45.605433 | orchestrator | 2026-04-20 03:02:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:02:45.607110 | orchestrator | 2026-04-20 03:02:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:45.607153 | orchestrator | 2026-04-20 03:02:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:02:48.657497 | orchestrator | 2026-04-20 03:02:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 03:02:48.659944 | orchestrator | 2026-04-20 03:02:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:48.659999 | orchestrator | 2026-04-20 03:02:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:02:51.708483 | orchestrator | 2026-04-20 03:02:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:02:51.711140 | orchestrator | 2026-04-20 03:02:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:51.711220 | orchestrator | 2026-04-20 03:02:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:02:54.759552 | orchestrator | 2026-04-20 03:02:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:02:54.761596 | orchestrator | 2026-04-20 03:02:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:54.761656 | orchestrator | 2026-04-20 03:02:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:02:57.817886 | orchestrator | 2026-04-20 03:02:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:02:57.821694 | orchestrator | 2026-04-20 03:02:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:02:57.821760 | orchestrator | 2026-04-20 03:02:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:03:00.869881 | orchestrator | 2026-04-20 03:03:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:03:00.870475 | orchestrator | 2026-04-20 03:03:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:03:00.870552 | orchestrator | 2026-04-20 03:03:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:03:03.923069 | orchestrator | 2026-04-20 03:03:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:03:03.923768 | orchestrator | 2026-04-20 03:03:03 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:03:03.923830 | orchestrator | 2026-04-20 03:03:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:03:06.974378 | orchestrator | 2026-04-20 03:03:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:03:06.976155 | orchestrator | 2026-04-20 03:03:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:03:06.976216 | orchestrator | 2026-04-20 03:03:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:03:10.022347 | orchestrator | 2026-04-20 03:03:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:03:10.024755 | orchestrator | 2026-04-20 03:03:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:03:10.024795 | orchestrator | 2026-04-20 03:03:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:03:13.075685 | orchestrator | 2026-04-20 03:03:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:03:13.078067 | orchestrator | 2026-04-20 03:03:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:03:13.078120 | orchestrator | 2026-04-20 03:03:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:03:16.127136 | orchestrator | 2026-04-20 03:03:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:03:16.130498 | orchestrator | 2026-04-20 03:03:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:03:16.130573 | orchestrator | 2026-04-20 03:03:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:03:19.174384 | orchestrator | 2026-04-20 03:03:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:03:19.176570 | orchestrator | 2026-04-20 03:03:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
03:03:19.176626 | orchestrator | 2026-04-20 03:03:19 | INFO  | Wait 1 second(s) until the next check
2026-04-20 03:03:22.223410 | orchestrator | 2026-04-20 03:03:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 03:03:22.225976 | orchestrator | 2026-04-20 03:03:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 03:03:22.226171 | orchestrator | 2026-04-20 03:03:22 | INFO  | Wait 1 second(s) until the next check
[... identical polling cycle repeated every ~3 seconds from 03:03:25 through 03:08:48; tasks 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e and 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c remain in state STARTED throughout ...]
2026-04-20 03:08:51.718189 | orchestrator | 2026-04-20 03:08:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 03:08:51.719796 | orchestrator | 2026-04-20 03:08:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 03:08:51.720165 | orchestrator | 2026-04-20 03:08:51 | INFO  | Wait 1 second(s)
until the next check 2026-04-20 03:08:54.763539 | orchestrator | 2026-04-20 03:08:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:08:54.764530 | orchestrator | 2026-04-20 03:08:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:08:54.764605 | orchestrator | 2026-04-20 03:08:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:08:57.819583 | orchestrator | 2026-04-20 03:08:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:08:57.819868 | orchestrator | 2026-04-20 03:08:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:08:57.820006 | orchestrator | 2026-04-20 03:08:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:00.869938 | orchestrator | 2026-04-20 03:09:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:00.871669 | orchestrator | 2026-04-20 03:09:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:00.871835 | orchestrator | 2026-04-20 03:09:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:03.925990 | orchestrator | 2026-04-20 03:09:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:03.927483 | orchestrator | 2026-04-20 03:09:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:03.927526 | orchestrator | 2026-04-20 03:09:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:06.973170 | orchestrator | 2026-04-20 03:09:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:06.973549 | orchestrator | 2026-04-20 03:09:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:06.973575 | orchestrator | 2026-04-20 03:09:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:10.020184 | orchestrator | 2026-04-20 
03:09:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:10.020975 | orchestrator | 2026-04-20 03:09:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:10.021339 | orchestrator | 2026-04-20 03:09:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:13.067452 | orchestrator | 2026-04-20 03:09:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:13.068317 | orchestrator | 2026-04-20 03:09:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:13.068336 | orchestrator | 2026-04-20 03:09:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:16.121486 | orchestrator | 2026-04-20 03:09:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:16.122996 | orchestrator | 2026-04-20 03:09:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:16.123077 | orchestrator | 2026-04-20 03:09:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:19.174809 | orchestrator | 2026-04-20 03:09:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:19.175754 | orchestrator | 2026-04-20 03:09:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:19.175787 | orchestrator | 2026-04-20 03:09:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:22.224422 | orchestrator | 2026-04-20 03:09:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:22.226253 | orchestrator | 2026-04-20 03:09:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:22.226432 | orchestrator | 2026-04-20 03:09:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:25.277141 | orchestrator | 2026-04-20 03:09:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 03:09:25.277267 | orchestrator | 2026-04-20 03:09:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:25.277275 | orchestrator | 2026-04-20 03:09:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:28.322710 | orchestrator | 2026-04-20 03:09:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:28.324415 | orchestrator | 2026-04-20 03:09:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:28.324475 | orchestrator | 2026-04-20 03:09:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:31.368454 | orchestrator | 2026-04-20 03:09:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:31.370323 | orchestrator | 2026-04-20 03:09:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:31.370447 | orchestrator | 2026-04-20 03:09:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:34.425566 | orchestrator | 2026-04-20 03:09:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:34.426392 | orchestrator | 2026-04-20 03:09:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:34.426418 | orchestrator | 2026-04-20 03:09:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:37.475329 | orchestrator | 2026-04-20 03:09:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:37.477124 | orchestrator | 2026-04-20 03:09:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:37.477159 | orchestrator | 2026-04-20 03:09:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:40.523507 | orchestrator | 2026-04-20 03:09:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:40.525749 | orchestrator | 2026-04-20 03:09:40 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:40.525808 | orchestrator | 2026-04-20 03:09:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:43.573555 | orchestrator | 2026-04-20 03:09:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:43.575440 | orchestrator | 2026-04-20 03:09:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:43.575519 | orchestrator | 2026-04-20 03:09:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:46.626406 | orchestrator | 2026-04-20 03:09:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:46.629550 | orchestrator | 2026-04-20 03:09:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:46.629624 | orchestrator | 2026-04-20 03:09:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:49.680755 | orchestrator | 2026-04-20 03:09:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:49.682247 | orchestrator | 2026-04-20 03:09:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:49.682432 | orchestrator | 2026-04-20 03:09:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:52.734922 | orchestrator | 2026-04-20 03:09:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:52.735626 | orchestrator | 2026-04-20 03:09:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:52.735683 | orchestrator | 2026-04-20 03:09:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:55.782255 | orchestrator | 2026-04-20 03:09:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:55.783088 | orchestrator | 2026-04-20 03:09:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
03:09:55.783211 | orchestrator | 2026-04-20 03:09:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:09:58.828958 | orchestrator | 2026-04-20 03:09:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:09:58.829929 | orchestrator | 2026-04-20 03:09:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:09:58.830005 | orchestrator | 2026-04-20 03:09:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:01.878135 | orchestrator | 2026-04-20 03:10:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:01.879699 | orchestrator | 2026-04-20 03:10:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:01.879797 | orchestrator | 2026-04-20 03:10:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:04.916196 | orchestrator | 2026-04-20 03:10:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:04.917470 | orchestrator | 2026-04-20 03:10:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:04.917509 | orchestrator | 2026-04-20 03:10:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:07.959078 | orchestrator | 2026-04-20 03:10:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:07.959789 | orchestrator | 2026-04-20 03:10:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:07.959934 | orchestrator | 2026-04-20 03:10:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:11.008588 | orchestrator | 2026-04-20 03:10:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:11.009302 | orchestrator | 2026-04-20 03:10:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:11.009352 | orchestrator | 2026-04-20 03:10:11 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 03:10:14.055170 | orchestrator | 2026-04-20 03:10:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:14.056833 | orchestrator | 2026-04-20 03:10:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:14.056895 | orchestrator | 2026-04-20 03:10:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:17.108444 | orchestrator | 2026-04-20 03:10:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:17.110963 | orchestrator | 2026-04-20 03:10:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:17.111039 | orchestrator | 2026-04-20 03:10:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:20.160471 | orchestrator | 2026-04-20 03:10:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:20.163056 | orchestrator | 2026-04-20 03:10:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:20.163152 | orchestrator | 2026-04-20 03:10:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:23.215475 | orchestrator | 2026-04-20 03:10:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:23.216955 | orchestrator | 2026-04-20 03:10:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:23.217076 | orchestrator | 2026-04-20 03:10:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:26.263377 | orchestrator | 2026-04-20 03:10:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:26.263539 | orchestrator | 2026-04-20 03:10:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:26.263861 | orchestrator | 2026-04-20 03:10:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:29.312392 | orchestrator | 2026-04-20 
03:10:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:29.312539 | orchestrator | 2026-04-20 03:10:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:29.312556 | orchestrator | 2026-04-20 03:10:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:32.353246 | orchestrator | 2026-04-20 03:10:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:32.353387 | orchestrator | 2026-04-20 03:10:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:32.353400 | orchestrator | 2026-04-20 03:10:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:35.396551 | orchestrator | 2026-04-20 03:10:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:35.398221 | orchestrator | 2026-04-20 03:10:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:35.398277 | orchestrator | 2026-04-20 03:10:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:38.450442 | orchestrator | 2026-04-20 03:10:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:38.451347 | orchestrator | 2026-04-20 03:10:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:38.451532 | orchestrator | 2026-04-20 03:10:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:41.496237 | orchestrator | 2026-04-20 03:10:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:41.498092 | orchestrator | 2026-04-20 03:10:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:41.498175 | orchestrator | 2026-04-20 03:10:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:44.539117 | orchestrator | 2026-04-20 03:10:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 03:10:44.540505 | orchestrator | 2026-04-20 03:10:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:44.540687 | orchestrator | 2026-04-20 03:10:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:47.584855 | orchestrator | 2026-04-20 03:10:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:47.586396 | orchestrator | 2026-04-20 03:10:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:47.586458 | orchestrator | 2026-04-20 03:10:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:50.634792 | orchestrator | 2026-04-20 03:10:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:50.637635 | orchestrator | 2026-04-20 03:10:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:50.637704 | orchestrator | 2026-04-20 03:10:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:53.689419 | orchestrator | 2026-04-20 03:10:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:53.690334 | orchestrator | 2026-04-20 03:10:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:53.690369 | orchestrator | 2026-04-20 03:10:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:56.746212 | orchestrator | 2026-04-20 03:10:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:56.747866 | orchestrator | 2026-04-20 03:10:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:56.747992 | orchestrator | 2026-04-20 03:10:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:10:59.798313 | orchestrator | 2026-04-20 03:10:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:10:59.798408 | orchestrator | 2026-04-20 03:10:59 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:10:59.798419 | orchestrator | 2026-04-20 03:10:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:11:02.836543 | orchestrator | 2026-04-20 03:11:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:02.837773 | orchestrator | 2026-04-20 03:11:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:11:02.838130 | orchestrator | 2026-04-20 03:11:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:11:05.897529 | orchestrator | 2026-04-20 03:11:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:05.897690 | orchestrator | 2026-04-20 03:11:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:11:05.898964 | orchestrator | 2026-04-20 03:11:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:11:08.949723 | orchestrator | 2026-04-20 03:11:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:08.950942 | orchestrator | 2026-04-20 03:11:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:11:08.950984 | orchestrator | 2026-04-20 03:11:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:11:11.997514 | orchestrator | 2026-04-20 03:11:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:11.999670 | orchestrator | 2026-04-20 03:11:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:11:11.999874 | orchestrator | 2026-04-20 03:11:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:11:15.036413 | orchestrator | 2026-04-20 03:11:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:15.037734 | orchestrator | 2026-04-20 03:11:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
03:11:15.037798 | orchestrator | 2026-04-20 03:11:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:11:18.075724 | orchestrator | 2026-04-20 03:11:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:18.076837 | orchestrator | 2026-04-20 03:11:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:11:18.076888 | orchestrator | 2026-04-20 03:11:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:11:21.122240 | orchestrator | 2026-04-20 03:11:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:21.122666 | orchestrator | 2026-04-20 03:11:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:11:21.122770 | orchestrator | 2026-04-20 03:11:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:11:24.168325 | orchestrator | 2026-04-20 03:11:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:24.172188 | orchestrator | 2026-04-20 03:11:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:11:24.172259 | orchestrator | 2026-04-20 03:11:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:11:27.217036 | orchestrator | 2026-04-20 03:11:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:27.217671 | orchestrator | 2026-04-20 03:11:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:11:27.218306 | orchestrator | 2026-04-20 03:11:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:11:30.265832 | orchestrator | 2026-04-20 03:11:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:30.266737 | orchestrator | 2026-04-20 03:11:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:11:30.266781 | orchestrator | 2026-04-20 03:11:30 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 03:11:33.315653 | orchestrator | 2026-04-20 03:11:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:33.321679 | orchestrator | 2026-04-20 03:11:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:11:33.321794 | orchestrator | 2026-04-20 03:11:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:11:36.371315 | orchestrator | 2026-04-20 03:11:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:36.373502 | orchestrator | 2026-04-20 03:11:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:11:36.373581 | orchestrator | 2026-04-20 03:11:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:11:39.424063 | orchestrator | 2026-04-20 03:11:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:39.424738 | orchestrator | 2026-04-20 03:11:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:11:39.424773 | orchestrator | 2026-04-20 03:11:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:11:42.481021 | orchestrator | 2026-04-20 03:11:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:42.484001 | orchestrator | 2026-04-20 03:11:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:11:42.484066 | orchestrator | 2026-04-20 03:11:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:11:45.524965 | orchestrator | 2026-04-20 03:11:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:45.528071 | orchestrator | 2026-04-20 03:11:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:11:45.528128 | orchestrator | 2026-04-20 03:11:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:11:48.573875 | orchestrator | 2026-04-20 
03:11:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:48.575280 | orchestrator | 2026-04-20 03:11:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:11:48.575367 | orchestrator | 2026-04-20 03:11:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:11:51.622353 | orchestrator | 2026-04-20 03:11:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:51.624427 | orchestrator | 2026-04-20 03:11:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:11:51.624479 | orchestrator | 2026-04-20 03:11:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:11:54.680417 | orchestrator | 2026-04-20 03:11:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:54.682578 | orchestrator | 2026-04-20 03:11:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:11:54.682664 | orchestrator | 2026-04-20 03:11:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:11:57.731824 | orchestrator | 2026-04-20 03:11:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:11:57.734136 | orchestrator | 2026-04-20 03:11:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:11:57.734196 | orchestrator | 2026-04-20 03:11:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:12:00.777200 | orchestrator | 2026-04-20 03:12:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:12:00.780262 | orchestrator | 2026-04-20 03:12:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:12:00.780346 | orchestrator | 2026-04-20 03:12:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:12:03.833254 | orchestrator | 2026-04-20 03:12:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 03:12:03.835190 | orchestrator | 2026-04-20 03:12:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:12:03.835283 | orchestrator | 2026-04-20 03:12:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:12:06.882959 | orchestrator | 2026-04-20 03:12:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:12:06.886443 | orchestrator | 2026-04-20 03:12:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:12:06.886583 | orchestrator | 2026-04-20 03:12:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:12:09.947842 | orchestrator | 2026-04-20 03:12:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:12:09.950588 | orchestrator | 2026-04-20 03:12:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:12:09.950664 | orchestrator | 2026-04-20 03:12:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:12:13.001023 | orchestrator | 2026-04-20 03:12:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:12:13.004289 | orchestrator | 2026-04-20 03:12:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:12:13.004407 | orchestrator | 2026-04-20 03:12:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:12:16.061145 | orchestrator | 2026-04-20 03:12:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:12:16.061227 | orchestrator | 2026-04-20 03:12:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:12:16.061236 | orchestrator | 2026-04-20 03:12:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:12:19.108574 | orchestrator | 2026-04-20 03:12:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:12:19.111949 | orchestrator | 2026-04-20 03:12:19 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:12:19.112018 | orchestrator | 2026-04-20 03:12:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:12:22.168819 | orchestrator | 2026-04-20 03:12:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:12:22.171202 | orchestrator | 2026-04-20 03:12:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:12:22.171262 | orchestrator | 2026-04-20 03:12:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:12:25.218254 | orchestrator | 2026-04-20 03:12:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:12:25.219656 | orchestrator | 2026-04-20 03:12:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:12:25.219692 | orchestrator | 2026-04-20 03:12:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:12:28.282633 | orchestrator | 2026-04-20 03:12:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:12:28.285814 | orchestrator | 2026-04-20 03:12:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:12:28.285875 | orchestrator | 2026-04-20 03:12:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:12:31.334669 | orchestrator | 2026-04-20 03:12:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:12:31.336866 | orchestrator | 2026-04-20 03:12:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:12:31.336934 | orchestrator | 2026-04-20 03:12:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:12:34.386241 | orchestrator | 2026-04-20 03:12:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:12:34.387605 | orchestrator | 2026-04-20 03:12:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
2026-04-20 03:12:34.387641 | orchestrator | 2026-04-20 03:12:34 | INFO  | Wait 1 second(s) until the next check
2026-04-20 03:12:37.438710 | orchestrator | 2026-04-20 03:12:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 03:12:37.442012 | orchestrator | 2026-04-20 03:12:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 03:12:37.442242 | orchestrator | 2026-04-20 03:12:37 | INFO  | Wait 1 second(s) until the next check
[... identical status checks repeat every ~3 seconds; both tasks remain in state STARTED through 03:17:33 ...]
2026-04-20 03:17:33.482998 | orchestrator | 2026-04-20 03:17:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 03:17:33.484712 | orchestrator | 2026-04-20 03:17:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 03:17:33.484767 | orchestrator | 2026-04-20 03:17:33 | INFO  | Wait 1 second(s) until the next check
2026-04-20 03:17:36.534290 | orchestrator | 2026-04-20 03:17:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 03:17:36.535844 | orchestrator | 2026-04-20 03:17:36 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:17:36.535902 | orchestrator | 2026-04-20 03:17:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:17:39.579759 | orchestrator | 2026-04-20 03:17:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:17:39.580786 | orchestrator | 2026-04-20 03:17:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:17:39.580850 | orchestrator | 2026-04-20 03:17:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:17:42.621672 | orchestrator | 2026-04-20 03:17:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:17:42.624002 | orchestrator | 2026-04-20 03:17:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:17:42.624047 | orchestrator | 2026-04-20 03:17:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:17:45.674103 | orchestrator | 2026-04-20 03:17:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:17:45.676154 | orchestrator | 2026-04-20 03:17:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:17:45.676263 | orchestrator | 2026-04-20 03:17:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:17:48.728788 | orchestrator | 2026-04-20 03:17:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:17:48.729131 | orchestrator | 2026-04-20 03:17:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:17:48.729174 | orchestrator | 2026-04-20 03:17:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:17:51.792439 | orchestrator | 2026-04-20 03:17:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:17:51.794117 | orchestrator | 2026-04-20 03:17:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
03:17:51.794266 | orchestrator | 2026-04-20 03:17:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:17:54.836767 | orchestrator | 2026-04-20 03:17:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:17:54.838650 | orchestrator | 2026-04-20 03:17:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:17:54.838741 | orchestrator | 2026-04-20 03:17:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:17:57.900333 | orchestrator | 2026-04-20 03:17:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:17:57.901338 | orchestrator | 2026-04-20 03:17:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:17:57.901382 | orchestrator | 2026-04-20 03:17:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:00.951469 | orchestrator | 2026-04-20 03:18:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:00.953705 | orchestrator | 2026-04-20 03:18:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:00.953759 | orchestrator | 2026-04-20 03:18:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:04.007539 | orchestrator | 2026-04-20 03:18:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:04.008701 | orchestrator | 2026-04-20 03:18:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:04.008785 | orchestrator | 2026-04-20 03:18:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:07.056584 | orchestrator | 2026-04-20 03:18:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:07.057641 | orchestrator | 2026-04-20 03:18:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:07.057675 | orchestrator | 2026-04-20 03:18:07 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 03:18:10.109199 | orchestrator | 2026-04-20 03:18:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:10.111064 | orchestrator | 2026-04-20 03:18:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:10.111173 | orchestrator | 2026-04-20 03:18:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:13.161369 | orchestrator | 2026-04-20 03:18:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:13.163423 | orchestrator | 2026-04-20 03:18:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:13.163523 | orchestrator | 2026-04-20 03:18:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:16.215592 | orchestrator | 2026-04-20 03:18:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:16.217123 | orchestrator | 2026-04-20 03:18:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:16.217231 | orchestrator | 2026-04-20 03:18:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:19.265901 | orchestrator | 2026-04-20 03:18:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:19.267091 | orchestrator | 2026-04-20 03:18:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:19.267188 | orchestrator | 2026-04-20 03:18:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:22.316777 | orchestrator | 2026-04-20 03:18:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:22.318392 | orchestrator | 2026-04-20 03:18:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:22.318447 | orchestrator | 2026-04-20 03:18:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:25.364483 | orchestrator | 2026-04-20 
03:18:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:25.366875 | orchestrator | 2026-04-20 03:18:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:25.366922 | orchestrator | 2026-04-20 03:18:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:28.413964 | orchestrator | 2026-04-20 03:18:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:28.414920 | orchestrator | 2026-04-20 03:18:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:28.414964 | orchestrator | 2026-04-20 03:18:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:31.463623 | orchestrator | 2026-04-20 03:18:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:31.464436 | orchestrator | 2026-04-20 03:18:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:31.464499 | orchestrator | 2026-04-20 03:18:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:34.508115 | orchestrator | 2026-04-20 03:18:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:34.509202 | orchestrator | 2026-04-20 03:18:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:34.509258 | orchestrator | 2026-04-20 03:18:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:37.560600 | orchestrator | 2026-04-20 03:18:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:37.562601 | orchestrator | 2026-04-20 03:18:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:37.562648 | orchestrator | 2026-04-20 03:18:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:40.612105 | orchestrator | 2026-04-20 03:18:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 03:18:40.613144 | orchestrator | 2026-04-20 03:18:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:40.613176 | orchestrator | 2026-04-20 03:18:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:43.658919 | orchestrator | 2026-04-20 03:18:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:43.661240 | orchestrator | 2026-04-20 03:18:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:43.661310 | orchestrator | 2026-04-20 03:18:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:46.705303 | orchestrator | 2026-04-20 03:18:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:46.706896 | orchestrator | 2026-04-20 03:18:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:46.706947 | orchestrator | 2026-04-20 03:18:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:49.755785 | orchestrator | 2026-04-20 03:18:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:49.757677 | orchestrator | 2026-04-20 03:18:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:49.757746 | orchestrator | 2026-04-20 03:18:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:52.799635 | orchestrator | 2026-04-20 03:18:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:52.801245 | orchestrator | 2026-04-20 03:18:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:52.801291 | orchestrator | 2026-04-20 03:18:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:55.844532 | orchestrator | 2026-04-20 03:18:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:55.846205 | orchestrator | 2026-04-20 03:18:55 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:55.846393 | orchestrator | 2026-04-20 03:18:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:18:58.896495 | orchestrator | 2026-04-20 03:18:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:18:58.898399 | orchestrator | 2026-04-20 03:18:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:18:58.898465 | orchestrator | 2026-04-20 03:18:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:01.947060 | orchestrator | 2026-04-20 03:19:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:01.950805 | orchestrator | 2026-04-20 03:19:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:01.950869 | orchestrator | 2026-04-20 03:19:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:04.996457 | orchestrator | 2026-04-20 03:19:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:05.000262 | orchestrator | 2026-04-20 03:19:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:05.000452 | orchestrator | 2026-04-20 03:19:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:08.052225 | orchestrator | 2026-04-20 03:19:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:08.053447 | orchestrator | 2026-04-20 03:19:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:08.053474 | orchestrator | 2026-04-20 03:19:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:11.099127 | orchestrator | 2026-04-20 03:19:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:11.102201 | orchestrator | 2026-04-20 03:19:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
03:19:11.102307 | orchestrator | 2026-04-20 03:19:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:14.157474 | orchestrator | 2026-04-20 03:19:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:14.159217 | orchestrator | 2026-04-20 03:19:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:14.159314 | orchestrator | 2026-04-20 03:19:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:17.208316 | orchestrator | 2026-04-20 03:19:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:17.209812 | orchestrator | 2026-04-20 03:19:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:17.209869 | orchestrator | 2026-04-20 03:19:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:20.265248 | orchestrator | 2026-04-20 03:19:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:20.265796 | orchestrator | 2026-04-20 03:19:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:20.265829 | orchestrator | 2026-04-20 03:19:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:23.315858 | orchestrator | 2026-04-20 03:19:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:23.319519 | orchestrator | 2026-04-20 03:19:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:23.319585 | orchestrator | 2026-04-20 03:19:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:26.362937 | orchestrator | 2026-04-20 03:19:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:26.364193 | orchestrator | 2026-04-20 03:19:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:26.364244 | orchestrator | 2026-04-20 03:19:26 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 03:19:29.415309 | orchestrator | 2026-04-20 03:19:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:29.417098 | orchestrator | 2026-04-20 03:19:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:29.417142 | orchestrator | 2026-04-20 03:19:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:32.466120 | orchestrator | 2026-04-20 03:19:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:32.468596 | orchestrator | 2026-04-20 03:19:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:32.469642 | orchestrator | 2026-04-20 03:19:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:35.514745 | orchestrator | 2026-04-20 03:19:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:35.514883 | orchestrator | 2026-04-20 03:19:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:35.514900 | orchestrator | 2026-04-20 03:19:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:38.562233 | orchestrator | 2026-04-20 03:19:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:38.562576 | orchestrator | 2026-04-20 03:19:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:38.562616 | orchestrator | 2026-04-20 03:19:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:41.608029 | orchestrator | 2026-04-20 03:19:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:41.610626 | orchestrator | 2026-04-20 03:19:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:41.610674 | orchestrator | 2026-04-20 03:19:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:44.660662 | orchestrator | 2026-04-20 
03:19:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:44.663209 | orchestrator | 2026-04-20 03:19:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:44.663305 | orchestrator | 2026-04-20 03:19:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:47.711910 | orchestrator | 2026-04-20 03:19:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:47.714238 | orchestrator | 2026-04-20 03:19:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:47.714491 | orchestrator | 2026-04-20 03:19:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:50.764469 | orchestrator | 2026-04-20 03:19:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:50.766216 | orchestrator | 2026-04-20 03:19:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:50.766304 | orchestrator | 2026-04-20 03:19:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:53.818861 | orchestrator | 2026-04-20 03:19:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:53.819716 | orchestrator | 2026-04-20 03:19:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:53.819746 | orchestrator | 2026-04-20 03:19:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:56.870350 | orchestrator | 2026-04-20 03:19:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:19:56.872651 | orchestrator | 2026-04-20 03:19:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:56.872824 | orchestrator | 2026-04-20 03:19:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:19:59.916516 | orchestrator | 2026-04-20 03:19:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 03:19:59.919085 | orchestrator | 2026-04-20 03:19:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:19:59.919146 | orchestrator | 2026-04-20 03:19:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:20:02.970353 | orchestrator | 2026-04-20 03:20:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:02.971129 | orchestrator | 2026-04-20 03:20:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:20:02.971176 | orchestrator | 2026-04-20 03:20:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:20:06.023644 | orchestrator | 2026-04-20 03:20:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:06.024798 | orchestrator | 2026-04-20 03:20:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:20:06.024852 | orchestrator | 2026-04-20 03:20:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:20:09.066569 | orchestrator | 2026-04-20 03:20:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:09.067380 | orchestrator | 2026-04-20 03:20:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:20:09.067457 | orchestrator | 2026-04-20 03:20:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:20:12.109950 | orchestrator | 2026-04-20 03:20:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:12.111294 | orchestrator | 2026-04-20 03:20:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:20:12.111359 | orchestrator | 2026-04-20 03:20:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:20:15.150328 | orchestrator | 2026-04-20 03:20:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:15.152703 | orchestrator | 2026-04-20 03:20:15 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:20:15.152768 | orchestrator | 2026-04-20 03:20:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:20:18.195604 | orchestrator | 2026-04-20 03:20:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:18.197075 | orchestrator | 2026-04-20 03:20:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:20:18.197136 | orchestrator | 2026-04-20 03:20:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:20:21.238668 | orchestrator | 2026-04-20 03:20:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:21.240018 | orchestrator | 2026-04-20 03:20:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:20:21.240098 | orchestrator | 2026-04-20 03:20:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:20:24.286935 | orchestrator | 2026-04-20 03:20:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:24.288232 | orchestrator | 2026-04-20 03:20:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:20:24.288290 | orchestrator | 2026-04-20 03:20:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:20:27.328207 | orchestrator | 2026-04-20 03:20:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:27.329529 | orchestrator | 2026-04-20 03:20:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:20:27.329580 | orchestrator | 2026-04-20 03:20:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:20:30.372466 | orchestrator | 2026-04-20 03:20:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:30.375447 | orchestrator | 2026-04-20 03:20:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
03:20:30.375576 | orchestrator | 2026-04-20 03:20:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:20:33.422473 | orchestrator | 2026-04-20 03:20:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:33.423770 | orchestrator | 2026-04-20 03:20:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:20:33.423829 | orchestrator | 2026-04-20 03:20:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:20:36.471473 | orchestrator | 2026-04-20 03:20:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:36.475243 | orchestrator | 2026-04-20 03:20:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:20:36.475454 | orchestrator | 2026-04-20 03:20:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:20:39.526482 | orchestrator | 2026-04-20 03:20:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:39.527913 | orchestrator | 2026-04-20 03:20:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:20:39.527998 | orchestrator | 2026-04-20 03:20:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:20:42.583989 | orchestrator | 2026-04-20 03:20:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:42.587604 | orchestrator | 2026-04-20 03:20:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:20:42.587676 | orchestrator | 2026-04-20 03:20:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:20:45.640878 | orchestrator | 2026-04-20 03:20:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:45.642396 | orchestrator | 2026-04-20 03:20:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:20:45.642431 | orchestrator | 2026-04-20 03:20:45 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 03:20:48.691242 | orchestrator | 2026-04-20 03:20:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:48.693482 | orchestrator | 2026-04-20 03:20:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:20:48.693564 | orchestrator | 2026-04-20 03:20:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:20:51.738805 | orchestrator | 2026-04-20 03:20:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:51.740534 | orchestrator | 2026-04-20 03:20:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:20:51.740624 | orchestrator | 2026-04-20 03:20:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:20:54.790404 | orchestrator | 2026-04-20 03:20:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:54.792740 | orchestrator | 2026-04-20 03:20:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:20:54.792790 | orchestrator | 2026-04-20 03:20:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:20:57.851175 | orchestrator | 2026-04-20 03:20:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:20:57.855019 | orchestrator | 2026-04-20 03:20:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:20:57.855080 | orchestrator | 2026-04-20 03:20:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:21:00.899728 | orchestrator | 2026-04-20 03:21:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:21:00.900418 | orchestrator | 2026-04-20 03:21:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:21:00.900461 | orchestrator | 2026-04-20 03:21:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:21:03.953163 | orchestrator | 2026-04-20 
03:21:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:21:03.954433 | orchestrator | 2026-04-20 03:21:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:21:03.954483 | orchestrator | 2026-04-20 03:21:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:21:07.011154 | orchestrator | 2026-04-20 03:21:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:21:07.015416 | orchestrator | 2026-04-20 03:21:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:21:07.015462 | orchestrator | 2026-04-20 03:21:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:21:10.068636 | orchestrator | 2026-04-20 03:21:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:21:10.070621 | orchestrator | 2026-04-20 03:21:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:21:10.070660 | orchestrator | 2026-04-20 03:21:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:21:13.125532 | orchestrator | 2026-04-20 03:21:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:21:13.129261 | orchestrator | 2026-04-20 03:21:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:21:13.129358 | orchestrator | 2026-04-20 03:21:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:21:16.178297 | orchestrator | 2026-04-20 03:21:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:21:16.179905 | orchestrator | 2026-04-20 03:21:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:21:16.179943 | orchestrator | 2026-04-20 03:21:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:21:19.231887 | orchestrator | 2026-04-20 03:21:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 03:21:19.232556 | orchestrator | 2026-04-20 03:21:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:21:19.232599 | orchestrator | 2026-04-20 03:21:19 | INFO  | Wait 1 second(s) until the next check
[... the three messages above repeat every ~3 seconds for tasks 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e and 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c, which remain in state STARTED throughout: from 03:21:22 to 03:24:10, then (after a gap in the log between 03:24:10 and 03:26:13) from 03:26:13 to 03:28:48 ...]
2026-04-20 03:28:51.629207 | orchestrator | 2026-04-20 03:28:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:28:51.631354 | orchestrator | 2026-04-20 03:28:51 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:28:51.631397 | orchestrator | 2026-04-20 03:28:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:28:54.679932 | orchestrator | 2026-04-20 03:28:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:28:54.681990 | orchestrator | 2026-04-20 03:28:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:28:54.682111 | orchestrator | 2026-04-20 03:28:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:28:57.727707 | orchestrator | 2026-04-20 03:28:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:28:57.729175 | orchestrator | 2026-04-20 03:28:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:28:57.729235 | orchestrator | 2026-04-20 03:28:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:00.768509 | orchestrator | 2026-04-20 03:29:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:00.768883 | orchestrator | 2026-04-20 03:29:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:00.768924 | orchestrator | 2026-04-20 03:29:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:03.815211 | orchestrator | 2026-04-20 03:29:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:03.816435 | orchestrator | 2026-04-20 03:29:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:03.816479 | orchestrator | 2026-04-20 03:29:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:06.860196 | orchestrator | 2026-04-20 03:29:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:06.862491 | orchestrator | 2026-04-20 03:29:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
03:29:06.862574 | orchestrator | 2026-04-20 03:29:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:09.908583 | orchestrator | 2026-04-20 03:29:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:09.911013 | orchestrator | 2026-04-20 03:29:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:09.911049 | orchestrator | 2026-04-20 03:29:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:12.956102 | orchestrator | 2026-04-20 03:29:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:12.957503 | orchestrator | 2026-04-20 03:29:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:12.957744 | orchestrator | 2026-04-20 03:29:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:15.995381 | orchestrator | 2026-04-20 03:29:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:15.996687 | orchestrator | 2026-04-20 03:29:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:15.996812 | orchestrator | 2026-04-20 03:29:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:19.043722 | orchestrator | 2026-04-20 03:29:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:19.045026 | orchestrator | 2026-04-20 03:29:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:19.045091 | orchestrator | 2026-04-20 03:29:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:22.085427 | orchestrator | 2026-04-20 03:29:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:22.088122 | orchestrator | 2026-04-20 03:29:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:22.088170 | orchestrator | 2026-04-20 03:29:22 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 03:29:25.136741 | orchestrator | 2026-04-20 03:29:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:25.137946 | orchestrator | 2026-04-20 03:29:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:25.138001 | orchestrator | 2026-04-20 03:29:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:28.185385 | orchestrator | 2026-04-20 03:29:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:28.187168 | orchestrator | 2026-04-20 03:29:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:28.187218 | orchestrator | 2026-04-20 03:29:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:31.233084 | orchestrator | 2026-04-20 03:29:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:31.233869 | orchestrator | 2026-04-20 03:29:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:31.234003 | orchestrator | 2026-04-20 03:29:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:34.273005 | orchestrator | 2026-04-20 03:29:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:34.274009 | orchestrator | 2026-04-20 03:29:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:34.274097 | orchestrator | 2026-04-20 03:29:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:37.329469 | orchestrator | 2026-04-20 03:29:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:37.331746 | orchestrator | 2026-04-20 03:29:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:37.331845 | orchestrator | 2026-04-20 03:29:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:40.377275 | orchestrator | 2026-04-20 
03:29:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:40.378885 | orchestrator | 2026-04-20 03:29:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:40.378997 | orchestrator | 2026-04-20 03:29:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:43.428862 | orchestrator | 2026-04-20 03:29:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:43.430845 | orchestrator | 2026-04-20 03:29:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:43.430894 | orchestrator | 2026-04-20 03:29:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:46.478073 | orchestrator | 2026-04-20 03:29:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:46.480244 | orchestrator | 2026-04-20 03:29:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:46.480362 | orchestrator | 2026-04-20 03:29:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:49.526278 | orchestrator | 2026-04-20 03:29:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:49.527853 | orchestrator | 2026-04-20 03:29:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:49.527893 | orchestrator | 2026-04-20 03:29:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:52.565540 | orchestrator | 2026-04-20 03:29:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:52.567717 | orchestrator | 2026-04-20 03:29:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:52.567763 | orchestrator | 2026-04-20 03:29:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:55.609904 | orchestrator | 2026-04-20 03:29:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 03:29:55.612719 | orchestrator | 2026-04-20 03:29:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:55.612785 | orchestrator | 2026-04-20 03:29:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:29:58.655349 | orchestrator | 2026-04-20 03:29:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:29:58.657367 | orchestrator | 2026-04-20 03:29:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:29:58.657422 | orchestrator | 2026-04-20 03:29:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:01.700524 | orchestrator | 2026-04-20 03:30:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:01.702107 | orchestrator | 2026-04-20 03:30:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:01.702162 | orchestrator | 2026-04-20 03:30:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:04.752438 | orchestrator | 2026-04-20 03:30:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:04.753679 | orchestrator | 2026-04-20 03:30:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:04.753746 | orchestrator | 2026-04-20 03:30:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:07.801327 | orchestrator | 2026-04-20 03:30:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:07.802172 | orchestrator | 2026-04-20 03:30:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:07.802190 | orchestrator | 2026-04-20 03:30:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:10.844585 | orchestrator | 2026-04-20 03:30:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:10.846263 | orchestrator | 2026-04-20 03:30:10 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:10.846334 | orchestrator | 2026-04-20 03:30:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:13.890631 | orchestrator | 2026-04-20 03:30:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:13.892074 | orchestrator | 2026-04-20 03:30:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:13.892152 | orchestrator | 2026-04-20 03:30:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:16.929327 | orchestrator | 2026-04-20 03:30:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:16.931362 | orchestrator | 2026-04-20 03:30:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:16.931405 | orchestrator | 2026-04-20 03:30:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:19.976106 | orchestrator | 2026-04-20 03:30:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:19.979158 | orchestrator | 2026-04-20 03:30:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:19.979282 | orchestrator | 2026-04-20 03:30:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:23.016238 | orchestrator | 2026-04-20 03:30:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:23.018313 | orchestrator | 2026-04-20 03:30:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:23.018503 | orchestrator | 2026-04-20 03:30:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:26.061641 | orchestrator | 2026-04-20 03:30:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:26.063980 | orchestrator | 2026-04-20 03:30:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
03:30:26.064077 | orchestrator | 2026-04-20 03:30:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:29.109996 | orchestrator | 2026-04-20 03:30:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:29.112455 | orchestrator | 2026-04-20 03:30:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:29.112511 | orchestrator | 2026-04-20 03:30:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:32.160211 | orchestrator | 2026-04-20 03:30:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:32.162135 | orchestrator | 2026-04-20 03:30:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:32.162195 | orchestrator | 2026-04-20 03:30:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:35.210095 | orchestrator | 2026-04-20 03:30:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:35.210889 | orchestrator | 2026-04-20 03:30:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:35.210957 | orchestrator | 2026-04-20 03:30:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:38.250893 | orchestrator | 2026-04-20 03:30:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:38.253402 | orchestrator | 2026-04-20 03:30:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:38.253462 | orchestrator | 2026-04-20 03:30:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:41.302354 | orchestrator | 2026-04-20 03:30:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:41.303641 | orchestrator | 2026-04-20 03:30:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:41.303692 | orchestrator | 2026-04-20 03:30:41 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 03:30:44.349662 | orchestrator | 2026-04-20 03:30:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:44.350964 | orchestrator | 2026-04-20 03:30:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:44.351016 | orchestrator | 2026-04-20 03:30:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:47.402184 | orchestrator | 2026-04-20 03:30:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:47.403796 | orchestrator | 2026-04-20 03:30:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:47.403836 | orchestrator | 2026-04-20 03:30:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:50.446169 | orchestrator | 2026-04-20 03:30:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:50.449504 | orchestrator | 2026-04-20 03:30:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:50.449578 | orchestrator | 2026-04-20 03:30:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:53.501908 | orchestrator | 2026-04-20 03:30:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:53.504579 | orchestrator | 2026-04-20 03:30:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:53.504677 | orchestrator | 2026-04-20 03:30:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:56.550551 | orchestrator | 2026-04-20 03:30:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:56.551940 | orchestrator | 2026-04-20 03:30:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:56.551990 | orchestrator | 2026-04-20 03:30:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:30:59.595068 | orchestrator | 2026-04-20 
03:30:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:30:59.596636 | orchestrator | 2026-04-20 03:30:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:30:59.596690 | orchestrator | 2026-04-20 03:30:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:02.638135 | orchestrator | 2026-04-20 03:31:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:31:02.638219 | orchestrator | 2026-04-20 03:31:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:31:02.638229 | orchestrator | 2026-04-20 03:31:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:05.676904 | orchestrator | 2026-04-20 03:31:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:31:05.678651 | orchestrator | 2026-04-20 03:31:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:31:05.678997 | orchestrator | 2026-04-20 03:31:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:08.726051 | orchestrator | 2026-04-20 03:31:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:31:08.729513 | orchestrator | 2026-04-20 03:31:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:31:08.729572 | orchestrator | 2026-04-20 03:31:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:11.770872 | orchestrator | 2026-04-20 03:31:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:31:11.773732 | orchestrator | 2026-04-20 03:31:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:31:11.773793 | orchestrator | 2026-04-20 03:31:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:14.818124 | orchestrator | 2026-04-20 03:31:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 03:31:14.820346 | orchestrator | 2026-04-20 03:31:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:31:14.820450 | orchestrator | 2026-04-20 03:31:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:17.870565 | orchestrator | 2026-04-20 03:31:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:31:17.870880 | orchestrator | 2026-04-20 03:31:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:31:17.871076 | orchestrator | 2026-04-20 03:31:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:20.919757 | orchestrator | 2026-04-20 03:31:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:31:20.920848 | orchestrator | 2026-04-20 03:31:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:31:20.920892 | orchestrator | 2026-04-20 03:31:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:23.968354 | orchestrator | 2026-04-20 03:31:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:31:23.970617 | orchestrator | 2026-04-20 03:31:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:31:23.970709 | orchestrator | 2026-04-20 03:31:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:27.011285 | orchestrator | 2026-04-20 03:31:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:31:27.013126 | orchestrator | 2026-04-20 03:31:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:31:27.013179 | orchestrator | 2026-04-20 03:31:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:30.064863 | orchestrator | 2026-04-20 03:31:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:31:30.064942 | orchestrator | 2026-04-20 03:31:30 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:31:30.064949 | orchestrator | 2026-04-20 03:31:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:33.106836 | orchestrator | 2026-04-20 03:31:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:31:33.107432 | orchestrator | 2026-04-20 03:31:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:31:33.107478 | orchestrator | 2026-04-20 03:31:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:36.151322 | orchestrator | 2026-04-20 03:31:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:31:36.153345 | orchestrator | 2026-04-20 03:31:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:31:36.153388 | orchestrator | 2026-04-20 03:31:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:39.198186 | orchestrator | 2026-04-20 03:31:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:31:39.199910 | orchestrator | 2026-04-20 03:31:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:31:39.199965 | orchestrator | 2026-04-20 03:31:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:42.246543 | orchestrator | 2026-04-20 03:31:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:31:42.247017 | orchestrator | 2026-04-20 03:31:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:31:42.247059 | orchestrator | 2026-04-20 03:31:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:45.293482 | orchestrator | 2026-04-20 03:31:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:31:45.295741 | orchestrator | 2026-04-20 03:31:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
03:31:45.295971 | orchestrator | 2026-04-20 03:31:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:48.343850 | orchestrator | 2026-04-20 03:31:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:31:48.346939 | orchestrator | 2026-04-20 03:31:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:31:48.347013 | orchestrator | 2026-04-20 03:31:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:51.386150 | orchestrator | 2026-04-20 03:31:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:31:51.387697 | orchestrator | 2026-04-20 03:31:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:31:51.387780 | orchestrator | 2026-04-20 03:31:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:54.435819 | orchestrator | 2026-04-20 03:31:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:31:54.436917 | orchestrator | 2026-04-20 03:31:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:31:54.436967 | orchestrator | 2026-04-20 03:31:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:31:57.486985 | orchestrator | 2026-04-20 03:31:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:31:57.488165 | orchestrator | 2026-04-20 03:31:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:31:57.488208 | orchestrator | 2026-04-20 03:31:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:32:00.533193 | orchestrator | 2026-04-20 03:32:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:32:00.534691 | orchestrator | 2026-04-20 03:32:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:32:00.534737 | orchestrator | 2026-04-20 03:32:00 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 03:32:03.570598 | orchestrator | 2026-04-20 03:32:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:32:03.571767 | orchestrator | 2026-04-20 03:32:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:32:03.571804 | orchestrator | 2026-04-20 03:32:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:32:06.613262 | orchestrator | 2026-04-20 03:32:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:32:06.614687 | orchestrator | 2026-04-20 03:32:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:32:06.614807 | orchestrator | 2026-04-20 03:32:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:32:09.658574 | orchestrator | 2026-04-20 03:32:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:32:09.659663 | orchestrator | 2026-04-20 03:32:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:32:09.659705 | orchestrator | 2026-04-20 03:32:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:32:12.704441 | orchestrator | 2026-04-20 03:32:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:32:12.705277 | orchestrator | 2026-04-20 03:32:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:32:12.705457 | orchestrator | 2026-04-20 03:32:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:32:15.751083 | orchestrator | 2026-04-20 03:32:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:32:15.755073 | orchestrator | 2026-04-20 03:32:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:32:15.755123 | orchestrator | 2026-04-20 03:32:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:32:18.790562 | orchestrator | 2026-04-20 
03:32:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:32:18.792952 | orchestrator | 2026-04-20 03:32:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:32:18.792999 | orchestrator | 2026-04-20 03:32:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:32:21.827254 | orchestrator | 2026-04-20 03:32:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:32:21.828342 | orchestrator | 2026-04-20 03:32:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:32:21.828388 | orchestrator | 2026-04-20 03:32:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:32:24.870166 | orchestrator | 2026-04-20 03:32:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:32:24.870565 | orchestrator | 2026-04-20 03:32:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:32:24.870601 | orchestrator | 2026-04-20 03:32:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:32:27.917594 | orchestrator | 2026-04-20 03:32:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:32:27.919726 | orchestrator | 2026-04-20 03:32:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:32:27.919783 | orchestrator | 2026-04-20 03:32:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:32:30.965871 | orchestrator | 2026-04-20 03:32:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:32:30.966638 | orchestrator | 2026-04-20 03:32:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:32:30.966685 | orchestrator | 2026-04-20 03:32:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:32:34.019072 | orchestrator | 2026-04-20 03:32:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED
2026-04-20 03:32:34.019167 | orchestrator | 2026-04-20 03:32:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 03:32:34.019287 | orchestrator | 2026-04-20 03:32:34 | INFO  | Wait 1 second(s) until the next check
2026-04-20 03:32:37.061263 | orchestrator | 2026-04-20 03:32:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 03:32:37.063659 | orchestrator | 2026-04-20 03:32:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 03:32:37.063738 | orchestrator | 2026-04-20 03:32:37 | INFO  | Wait 1 second(s) until the next check
2026-04-20 03:37:51.170209 | orchestrator | 2026-04-20 03:37:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state
STARTED 2026-04-20 03:37:51.171659 | orchestrator | 2026-04-20 03:37:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:37:51.171740 | orchestrator | 2026-04-20 03:37:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:37:54.223390 | orchestrator | 2026-04-20 03:37:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:37:54.225700 | orchestrator | 2026-04-20 03:37:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:37:54.225970 | orchestrator | 2026-04-20 03:37:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:37:57.276552 | orchestrator | 2026-04-20 03:37:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:37:57.280670 | orchestrator | 2026-04-20 03:37:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:37:57.280777 | orchestrator | 2026-04-20 03:37:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:00.336513 | orchestrator | 2026-04-20 03:38:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:00.338487 | orchestrator | 2026-04-20 03:38:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:00.338531 | orchestrator | 2026-04-20 03:38:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:03.391278 | orchestrator | 2026-04-20 03:38:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:03.392522 | orchestrator | 2026-04-20 03:38:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:03.392571 | orchestrator | 2026-04-20 03:38:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:06.441111 | orchestrator | 2026-04-20 03:38:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:06.442412 | orchestrator | 2026-04-20 03:38:06 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:06.442466 | orchestrator | 2026-04-20 03:38:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:09.488005 | orchestrator | 2026-04-20 03:38:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:09.490649 | orchestrator | 2026-04-20 03:38:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:09.490785 | orchestrator | 2026-04-20 03:38:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:12.536532 | orchestrator | 2026-04-20 03:38:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:12.539356 | orchestrator | 2026-04-20 03:38:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:12.539442 | orchestrator | 2026-04-20 03:38:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:15.584135 | orchestrator | 2026-04-20 03:38:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:15.585725 | orchestrator | 2026-04-20 03:38:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:15.585781 | orchestrator | 2026-04-20 03:38:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:18.638591 | orchestrator | 2026-04-20 03:38:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:18.639923 | orchestrator | 2026-04-20 03:38:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:18.639972 | orchestrator | 2026-04-20 03:38:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:21.673418 | orchestrator | 2026-04-20 03:38:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:21.674352 | orchestrator | 2026-04-20 03:38:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
03:38:21.674386 | orchestrator | 2026-04-20 03:38:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:24.716802 | orchestrator | 2026-04-20 03:38:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:24.718556 | orchestrator | 2026-04-20 03:38:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:24.718600 | orchestrator | 2026-04-20 03:38:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:27.766001 | orchestrator | 2026-04-20 03:38:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:27.767317 | orchestrator | 2026-04-20 03:38:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:27.767382 | orchestrator | 2026-04-20 03:38:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:30.818370 | orchestrator | 2026-04-20 03:38:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:30.820361 | orchestrator | 2026-04-20 03:38:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:30.820428 | orchestrator | 2026-04-20 03:38:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:33.864733 | orchestrator | 2026-04-20 03:38:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:33.865571 | orchestrator | 2026-04-20 03:38:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:33.865603 | orchestrator | 2026-04-20 03:38:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:36.917049 | orchestrator | 2026-04-20 03:38:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:36.918307 | orchestrator | 2026-04-20 03:38:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:36.918405 | orchestrator | 2026-04-20 03:38:36 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 03:38:39.968346 | orchestrator | 2026-04-20 03:38:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:39.970276 | orchestrator | 2026-04-20 03:38:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:39.970471 | orchestrator | 2026-04-20 03:38:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:43.018230 | orchestrator | 2026-04-20 03:38:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:43.019964 | orchestrator | 2026-04-20 03:38:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:43.020016 | orchestrator | 2026-04-20 03:38:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:46.062890 | orchestrator | 2026-04-20 03:38:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:46.065536 | orchestrator | 2026-04-20 03:38:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:46.066111 | orchestrator | 2026-04-20 03:38:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:49.118687 | orchestrator | 2026-04-20 03:38:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:49.121237 | orchestrator | 2026-04-20 03:38:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:49.121358 | orchestrator | 2026-04-20 03:38:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:52.165395 | orchestrator | 2026-04-20 03:38:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:52.166587 | orchestrator | 2026-04-20 03:38:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:52.166621 | orchestrator | 2026-04-20 03:38:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:55.213697 | orchestrator | 2026-04-20 
03:38:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:55.215183 | orchestrator | 2026-04-20 03:38:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:55.215248 | orchestrator | 2026-04-20 03:38:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:38:58.265289 | orchestrator | 2026-04-20 03:38:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:38:58.267102 | orchestrator | 2026-04-20 03:38:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:38:58.267178 | orchestrator | 2026-04-20 03:38:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:01.308560 | orchestrator | 2026-04-20 03:39:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:01.310931 | orchestrator | 2026-04-20 03:39:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:01.311065 | orchestrator | 2026-04-20 03:39:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:04.360349 | orchestrator | 2026-04-20 03:39:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:04.364101 | orchestrator | 2026-04-20 03:39:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:04.364148 | orchestrator | 2026-04-20 03:39:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:07.407032 | orchestrator | 2026-04-20 03:39:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:07.409031 | orchestrator | 2026-04-20 03:39:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:07.409094 | orchestrator | 2026-04-20 03:39:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:10.460207 | orchestrator | 2026-04-20 03:39:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 03:39:10.461214 | orchestrator | 2026-04-20 03:39:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:10.461259 | orchestrator | 2026-04-20 03:39:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:13.514645 | orchestrator | 2026-04-20 03:39:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:13.514754 | orchestrator | 2026-04-20 03:39:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:13.514771 | orchestrator | 2026-04-20 03:39:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:16.550816 | orchestrator | 2026-04-20 03:39:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:16.553262 | orchestrator | 2026-04-20 03:39:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:16.553306 | orchestrator | 2026-04-20 03:39:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:19.608231 | orchestrator | 2026-04-20 03:39:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:19.608367 | orchestrator | 2026-04-20 03:39:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:19.608394 | orchestrator | 2026-04-20 03:39:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:22.658839 | orchestrator | 2026-04-20 03:39:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:22.661148 | orchestrator | 2026-04-20 03:39:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:22.661247 | orchestrator | 2026-04-20 03:39:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:25.709561 | orchestrator | 2026-04-20 03:39:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:25.711917 | orchestrator | 2026-04-20 03:39:25 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:25.711965 | orchestrator | 2026-04-20 03:39:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:28.759350 | orchestrator | 2026-04-20 03:39:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:28.760626 | orchestrator | 2026-04-20 03:39:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:28.760738 | orchestrator | 2026-04-20 03:39:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:31.814169 | orchestrator | 2026-04-20 03:39:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:31.817239 | orchestrator | 2026-04-20 03:39:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:31.817329 | orchestrator | 2026-04-20 03:39:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:34.864645 | orchestrator | 2026-04-20 03:39:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:34.867262 | orchestrator | 2026-04-20 03:39:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:34.867336 | orchestrator | 2026-04-20 03:39:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:37.915486 | orchestrator | 2026-04-20 03:39:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:37.916628 | orchestrator | 2026-04-20 03:39:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:37.916667 | orchestrator | 2026-04-20 03:39:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:40.957255 | orchestrator | 2026-04-20 03:39:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:40.958126 | orchestrator | 2026-04-20 03:39:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
03:39:40.958169 | orchestrator | 2026-04-20 03:39:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:44.011389 | orchestrator | 2026-04-20 03:39:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:44.016294 | orchestrator | 2026-04-20 03:39:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:44.016392 | orchestrator | 2026-04-20 03:39:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:47.065643 | orchestrator | 2026-04-20 03:39:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:47.067166 | orchestrator | 2026-04-20 03:39:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:47.067231 | orchestrator | 2026-04-20 03:39:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:50.115716 | orchestrator | 2026-04-20 03:39:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:50.117547 | orchestrator | 2026-04-20 03:39:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:50.117785 | orchestrator | 2026-04-20 03:39:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:53.165894 | orchestrator | 2026-04-20 03:39:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:53.167706 | orchestrator | 2026-04-20 03:39:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:53.167784 | orchestrator | 2026-04-20 03:39:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:39:56.214564 | orchestrator | 2026-04-20 03:39:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:56.216343 | orchestrator | 2026-04-20 03:39:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:56.216377 | orchestrator | 2026-04-20 03:39:56 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 03:39:59.264769 | orchestrator | 2026-04-20 03:39:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:39:59.266098 | orchestrator | 2026-04-20 03:39:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:39:59.266149 | orchestrator | 2026-04-20 03:39:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:02.309525 | orchestrator | 2026-04-20 03:40:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:40:02.309627 | orchestrator | 2026-04-20 03:40:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:02.309640 | orchestrator | 2026-04-20 03:40:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:05.355247 | orchestrator | 2026-04-20 03:40:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:40:05.356040 | orchestrator | 2026-04-20 03:40:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:05.356066 | orchestrator | 2026-04-20 03:40:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:08.398640 | orchestrator | 2026-04-20 03:40:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:40:08.399552 | orchestrator | 2026-04-20 03:40:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:08.399592 | orchestrator | 2026-04-20 03:40:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:11.440399 | orchestrator | 2026-04-20 03:40:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:40:11.442309 | orchestrator | 2026-04-20 03:40:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:11.442403 | orchestrator | 2026-04-20 03:40:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:14.482155 | orchestrator | 2026-04-20 
03:40:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:40:14.483159 | orchestrator | 2026-04-20 03:40:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:14.483193 | orchestrator | 2026-04-20 03:40:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:17.532992 | orchestrator | 2026-04-20 03:40:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:40:17.536182 | orchestrator | 2026-04-20 03:40:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:17.536268 | orchestrator | 2026-04-20 03:40:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:20.583382 | orchestrator | 2026-04-20 03:40:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:40:20.584448 | orchestrator | 2026-04-20 03:40:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:20.584501 | orchestrator | 2026-04-20 03:40:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:23.631558 | orchestrator | 2026-04-20 03:40:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:40:23.635586 | orchestrator | 2026-04-20 03:40:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:23.636350 | orchestrator | 2026-04-20 03:40:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:26.683601 | orchestrator | 2026-04-20 03:40:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:40:26.686099 | orchestrator | 2026-04-20 03:40:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:26.686245 | orchestrator | 2026-04-20 03:40:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:29.743025 | orchestrator | 2026-04-20 03:40:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 03:40:29.744114 | orchestrator | 2026-04-20 03:40:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:29.744258 | orchestrator | 2026-04-20 03:40:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:32.785403 | orchestrator | 2026-04-20 03:40:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:40:32.787790 | orchestrator | 2026-04-20 03:40:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:32.788074 | orchestrator | 2026-04-20 03:40:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:35.835712 | orchestrator | 2026-04-20 03:40:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:40:35.836677 | orchestrator | 2026-04-20 03:40:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:35.836697 | orchestrator | 2026-04-20 03:40:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:38.885216 | orchestrator | 2026-04-20 03:40:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:40:38.887384 | orchestrator | 2026-04-20 03:40:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:38.887444 | orchestrator | 2026-04-20 03:40:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:41.936295 | orchestrator | 2026-04-20 03:40:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:40:41.936518 | orchestrator | 2026-04-20 03:40:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:41.937300 | orchestrator | 2026-04-20 03:40:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:44.981537 | orchestrator | 2026-04-20 03:40:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:40:44.984092 | orchestrator | 2026-04-20 03:40:44 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:44.984225 | orchestrator | 2026-04-20 03:40:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:48.030008 | orchestrator | 2026-04-20 03:40:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:40:48.030812 | orchestrator | 2026-04-20 03:40:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:48.030878 | orchestrator | 2026-04-20 03:40:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:51.073006 | orchestrator | 2026-04-20 03:40:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:40:51.074359 | orchestrator | 2026-04-20 03:40:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:51.074408 | orchestrator | 2026-04-20 03:40:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:54.121817 | orchestrator | 2026-04-20 03:40:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:40:54.123690 | orchestrator | 2026-04-20 03:40:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:54.123809 | orchestrator | 2026-04-20 03:40:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:40:57.172774 | orchestrator | 2026-04-20 03:40:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:40:57.174570 | orchestrator | 2026-04-20 03:40:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:40:57.174718 | orchestrator | 2026-04-20 03:40:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:41:00.219335 | orchestrator | 2026-04-20 03:41:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:41:00.220967 | orchestrator | 2026-04-20 03:41:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
03:41:00.221014 | orchestrator | 2026-04-20 03:41:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:41:03.269386 | orchestrator | 2026-04-20 03:41:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:41:03.271070 | orchestrator | 2026-04-20 03:41:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:41:03.271121 | orchestrator | 2026-04-20 03:41:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:41:06.322538 | orchestrator | 2026-04-20 03:41:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:41:06.325151 | orchestrator | 2026-04-20 03:41:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:41:06.325198 | orchestrator | 2026-04-20 03:41:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:41:09.369201 | orchestrator | 2026-04-20 03:41:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:41:09.371055 | orchestrator | 2026-04-20 03:41:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:41:09.371174 | orchestrator | 2026-04-20 03:41:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:41:12.416711 | orchestrator | 2026-04-20 03:41:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:41:12.419865 | orchestrator | 2026-04-20 03:41:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:41:12.420157 | orchestrator | 2026-04-20 03:41:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:41:15.468869 | orchestrator | 2026-04-20 03:41:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:41:15.471082 | orchestrator | 2026-04-20 03:41:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:41:15.471149 | orchestrator | 2026-04-20 03:41:15 | INFO  | Wait 1 second(s) 
until the next check
2026-04-20 03:41:18.519514 | orchestrator | 2026-04-20 03:41:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 03:41:18.520741 | orchestrator | 2026-04-20 03:41:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 03:41:18.521158 | orchestrator | 2026-04-20 03:41:18 | INFO  | Wait 1 second(s) until the next check
[identical status checks repeated every ~3 seconds from 03:41:21 to 03:46:29; both tasks remained in state STARTED throughout]
2026-04-20 03:46:32.625096 | orchestrator | 2026-04-20 03:46:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 03:46:32.627482 | orchestrator | 2026-04-20 03:46:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 03:46:32.627568 | orchestrator | 2026-04-20 03:46:32 | INFO  | Wait 1 second(s)
until the next check 2026-04-20 03:46:35.680906 | orchestrator | 2026-04-20 03:46:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:46:35.683377 | orchestrator | 2026-04-20 03:46:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:46:35.683475 | orchestrator | 2026-04-20 03:46:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:46:38.735724 | orchestrator | 2026-04-20 03:46:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:46:38.737240 | orchestrator | 2026-04-20 03:46:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:46:38.737631 | orchestrator | 2026-04-20 03:46:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:46:41.794349 | orchestrator | 2026-04-20 03:46:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:46:41.795191 | orchestrator | 2026-04-20 03:46:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:46:41.795229 | orchestrator | 2026-04-20 03:46:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:46:44.841893 | orchestrator | 2026-04-20 03:46:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:46:44.844124 | orchestrator | 2026-04-20 03:46:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:46:44.844193 | orchestrator | 2026-04-20 03:46:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:46:47.895940 | orchestrator | 2026-04-20 03:46:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:46:47.901481 | orchestrator | 2026-04-20 03:46:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:46:47.901554 | orchestrator | 2026-04-20 03:46:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:46:50.954377 | orchestrator | 2026-04-20 
03:46:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:46:50.956470 | orchestrator | 2026-04-20 03:46:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:46:50.956566 | orchestrator | 2026-04-20 03:46:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:46:54.000862 | orchestrator | 2026-04-20 03:46:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:46:54.002273 | orchestrator | 2026-04-20 03:46:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:46:54.002348 | orchestrator | 2026-04-20 03:46:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:46:57.047594 | orchestrator | 2026-04-20 03:46:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:46:57.048732 | orchestrator | 2026-04-20 03:46:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:46:57.048883 | orchestrator | 2026-04-20 03:46:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:00.085153 | orchestrator | 2026-04-20 03:47:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:00.086679 | orchestrator | 2026-04-20 03:47:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:00.086816 | orchestrator | 2026-04-20 03:47:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:03.134874 | orchestrator | 2026-04-20 03:47:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:03.136827 | orchestrator | 2026-04-20 03:47:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:03.136905 | orchestrator | 2026-04-20 03:47:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:06.186067 | orchestrator | 2026-04-20 03:47:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 03:47:06.186265 | orchestrator | 2026-04-20 03:47:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:06.186289 | orchestrator | 2026-04-20 03:47:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:09.235731 | orchestrator | 2026-04-20 03:47:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:09.238105 | orchestrator | 2026-04-20 03:47:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:09.238588 | orchestrator | 2026-04-20 03:47:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:12.290453 | orchestrator | 2026-04-20 03:47:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:12.291377 | orchestrator | 2026-04-20 03:47:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:12.291423 | orchestrator | 2026-04-20 03:47:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:15.344888 | orchestrator | 2026-04-20 03:47:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:15.346917 | orchestrator | 2026-04-20 03:47:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:15.346983 | orchestrator | 2026-04-20 03:47:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:18.389436 | orchestrator | 2026-04-20 03:47:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:18.390082 | orchestrator | 2026-04-20 03:47:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:18.390149 | orchestrator | 2026-04-20 03:47:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:21.434626 | orchestrator | 2026-04-20 03:47:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:21.435741 | orchestrator | 2026-04-20 03:47:21 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:21.435836 | orchestrator | 2026-04-20 03:47:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:24.486100 | orchestrator | 2026-04-20 03:47:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:24.487841 | orchestrator | 2026-04-20 03:47:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:24.487872 | orchestrator | 2026-04-20 03:47:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:27.538919 | orchestrator | 2026-04-20 03:47:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:27.541083 | orchestrator | 2026-04-20 03:47:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:27.541115 | orchestrator | 2026-04-20 03:47:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:30.590263 | orchestrator | 2026-04-20 03:47:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:30.591763 | orchestrator | 2026-04-20 03:47:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:30.591787 | orchestrator | 2026-04-20 03:47:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:33.643390 | orchestrator | 2026-04-20 03:47:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:33.644973 | orchestrator | 2026-04-20 03:47:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:33.645027 | orchestrator | 2026-04-20 03:47:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:36.697191 | orchestrator | 2026-04-20 03:47:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:36.697739 | orchestrator | 2026-04-20 03:47:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
03:47:36.697874 | orchestrator | 2026-04-20 03:47:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:39.748898 | orchestrator | 2026-04-20 03:47:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:39.749180 | orchestrator | 2026-04-20 03:47:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:39.749203 | orchestrator | 2026-04-20 03:47:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:42.793468 | orchestrator | 2026-04-20 03:47:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:42.794905 | orchestrator | 2026-04-20 03:47:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:42.794990 | orchestrator | 2026-04-20 03:47:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:45.842895 | orchestrator | 2026-04-20 03:47:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:45.844214 | orchestrator | 2026-04-20 03:47:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:45.844440 | orchestrator | 2026-04-20 03:47:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:48.892639 | orchestrator | 2026-04-20 03:47:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:48.894477 | orchestrator | 2026-04-20 03:47:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:48.894501 | orchestrator | 2026-04-20 03:47:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:51.944780 | orchestrator | 2026-04-20 03:47:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:51.947403 | orchestrator | 2026-04-20 03:47:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:51.947469 | orchestrator | 2026-04-20 03:47:51 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 03:47:54.992465 | orchestrator | 2026-04-20 03:47:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:54.993375 | orchestrator | 2026-04-20 03:47:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:54.993489 | orchestrator | 2026-04-20 03:47:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:47:58.035957 | orchestrator | 2026-04-20 03:47:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:47:58.037477 | orchestrator | 2026-04-20 03:47:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:47:58.037520 | orchestrator | 2026-04-20 03:47:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:01.073647 | orchestrator | 2026-04-20 03:48:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:01.076943 | orchestrator | 2026-04-20 03:48:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:01.077012 | orchestrator | 2026-04-20 03:48:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:04.120355 | orchestrator | 2026-04-20 03:48:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:04.122681 | orchestrator | 2026-04-20 03:48:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:04.122753 | orchestrator | 2026-04-20 03:48:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:07.161395 | orchestrator | 2026-04-20 03:48:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:07.163584 | orchestrator | 2026-04-20 03:48:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:07.164144 | orchestrator | 2026-04-20 03:48:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:10.209127 | orchestrator | 2026-04-20 
03:48:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:10.210508 | orchestrator | 2026-04-20 03:48:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:10.210542 | orchestrator | 2026-04-20 03:48:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:13.255435 | orchestrator | 2026-04-20 03:48:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:13.256697 | orchestrator | 2026-04-20 03:48:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:13.256993 | orchestrator | 2026-04-20 03:48:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:16.301061 | orchestrator | 2026-04-20 03:48:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:16.303013 | orchestrator | 2026-04-20 03:48:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:16.303075 | orchestrator | 2026-04-20 03:48:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:19.350743 | orchestrator | 2026-04-20 03:48:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:19.352160 | orchestrator | 2026-04-20 03:48:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:19.352262 | orchestrator | 2026-04-20 03:48:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:22.395664 | orchestrator | 2026-04-20 03:48:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:22.397026 | orchestrator | 2026-04-20 03:48:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:22.397101 | orchestrator | 2026-04-20 03:48:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:25.439945 | orchestrator | 2026-04-20 03:48:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 03:48:25.441601 | orchestrator | 2026-04-20 03:48:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:25.441654 | orchestrator | 2026-04-20 03:48:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:28.481466 | orchestrator | 2026-04-20 03:48:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:28.481846 | orchestrator | 2026-04-20 03:48:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:28.481928 | orchestrator | 2026-04-20 03:48:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:31.530822 | orchestrator | 2026-04-20 03:48:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:31.533424 | orchestrator | 2026-04-20 03:48:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:31.533479 | orchestrator | 2026-04-20 03:48:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:34.577343 | orchestrator | 2026-04-20 03:48:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:34.579133 | orchestrator | 2026-04-20 03:48:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:34.579286 | orchestrator | 2026-04-20 03:48:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:37.621589 | orchestrator | 2026-04-20 03:48:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:37.624129 | orchestrator | 2026-04-20 03:48:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:37.624198 | orchestrator | 2026-04-20 03:48:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:40.663681 | orchestrator | 2026-04-20 03:48:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:40.667794 | orchestrator | 2026-04-20 03:48:40 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:40.667977 | orchestrator | 2026-04-20 03:48:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:43.710443 | orchestrator | 2026-04-20 03:48:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:43.711089 | orchestrator | 2026-04-20 03:48:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:43.711150 | orchestrator | 2026-04-20 03:48:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:46.757926 | orchestrator | 2026-04-20 03:48:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:46.760867 | orchestrator | 2026-04-20 03:48:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:46.760979 | orchestrator | 2026-04-20 03:48:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:49.810392 | orchestrator | 2026-04-20 03:48:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:49.811853 | orchestrator | 2026-04-20 03:48:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:49.811950 | orchestrator | 2026-04-20 03:48:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:52.857542 | orchestrator | 2026-04-20 03:48:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:52.858231 | orchestrator | 2026-04-20 03:48:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:52.858274 | orchestrator | 2026-04-20 03:48:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:55.910848 | orchestrator | 2026-04-20 03:48:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:55.911978 | orchestrator | 2026-04-20 03:48:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
03:48:55.912023 | orchestrator | 2026-04-20 03:48:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:48:58.960459 | orchestrator | 2026-04-20 03:48:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:48:58.962458 | orchestrator | 2026-04-20 03:48:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:48:58.962560 | orchestrator | 2026-04-20 03:48:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:02.014372 | orchestrator | 2026-04-20 03:49:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:02.017287 | orchestrator | 2026-04-20 03:49:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:02.017382 | orchestrator | 2026-04-20 03:49:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:05.067257 | orchestrator | 2026-04-20 03:49:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:05.069477 | orchestrator | 2026-04-20 03:49:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:05.069568 | orchestrator | 2026-04-20 03:49:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:08.114248 | orchestrator | 2026-04-20 03:49:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:08.116625 | orchestrator | 2026-04-20 03:49:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:08.116795 | orchestrator | 2026-04-20 03:49:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:11.168769 | orchestrator | 2026-04-20 03:49:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:11.170852 | orchestrator | 2026-04-20 03:49:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:11.171203 | orchestrator | 2026-04-20 03:49:11 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 03:49:14.229132 | orchestrator | 2026-04-20 03:49:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:14.230767 | orchestrator | 2026-04-20 03:49:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:14.230814 | orchestrator | 2026-04-20 03:49:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:17.283358 | orchestrator | 2026-04-20 03:49:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:17.285479 | orchestrator | 2026-04-20 03:49:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:17.285554 | orchestrator | 2026-04-20 03:49:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:20.326891 | orchestrator | 2026-04-20 03:49:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:20.329276 | orchestrator | 2026-04-20 03:49:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:20.329479 | orchestrator | 2026-04-20 03:49:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:23.373688 | orchestrator | 2026-04-20 03:49:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:23.376448 | orchestrator | 2026-04-20 03:49:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:23.376588 | orchestrator | 2026-04-20 03:49:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:26.423838 | orchestrator | 2026-04-20 03:49:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:26.426853 | orchestrator | 2026-04-20 03:49:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:26.426997 | orchestrator | 2026-04-20 03:49:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:29.473462 | orchestrator | 2026-04-20 
03:49:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:29.475353 | orchestrator | 2026-04-20 03:49:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:29.475405 | orchestrator | 2026-04-20 03:49:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:32.521719 | orchestrator | 2026-04-20 03:49:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:32.522461 | orchestrator | 2026-04-20 03:49:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:32.522499 | orchestrator | 2026-04-20 03:49:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:35.571090 | orchestrator | 2026-04-20 03:49:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:35.573564 | orchestrator | 2026-04-20 03:49:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:35.573646 | orchestrator | 2026-04-20 03:49:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:38.622588 | orchestrator | 2026-04-20 03:49:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:38.625642 | orchestrator | 2026-04-20 03:49:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:38.625682 | orchestrator | 2026-04-20 03:49:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:41.675126 | orchestrator | 2026-04-20 03:49:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:41.676591 | orchestrator | 2026-04-20 03:49:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:41.676616 | orchestrator | 2026-04-20 03:49:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:44.725120 | orchestrator | 2026-04-20 03:49:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 03:49:44.725856 | orchestrator | 2026-04-20 03:49:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:44.726266 | orchestrator | 2026-04-20 03:49:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:47.783478 | orchestrator | 2026-04-20 03:49:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:47.784499 | orchestrator | 2026-04-20 03:49:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:47.784622 | orchestrator | 2026-04-20 03:49:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:50.832178 | orchestrator | 2026-04-20 03:49:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:50.833008 | orchestrator | 2026-04-20 03:49:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:50.833050 | orchestrator | 2026-04-20 03:49:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:53.884134 | orchestrator | 2026-04-20 03:49:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:53.886204 | orchestrator | 2026-04-20 03:49:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:53.886286 | orchestrator | 2026-04-20 03:49:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:56.925425 | orchestrator | 2026-04-20 03:49:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:56.927121 | orchestrator | 2026-04-20 03:49:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:56.927537 | orchestrator | 2026-04-20 03:49:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:49:59.972513 | orchestrator | 2026-04-20 03:49:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:49:59.973996 | orchestrator | 2026-04-20 03:49:59 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:49:59.974196 | orchestrator | 2026-04-20 03:49:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:50:03.025384 | orchestrator | 2026-04-20 03:50:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:50:03.027524 | orchestrator | 2026-04-20 03:50:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:50:03.027594 | orchestrator | 2026-04-20 03:50:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:50:06.076016 | orchestrator | 2026-04-20 03:50:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:50:06.077198 | orchestrator | 2026-04-20 03:50:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:50:06.077602 | orchestrator | 2026-04-20 03:50:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:50:09.126250 | orchestrator | 2026-04-20 03:50:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:50:09.128516 | orchestrator | 2026-04-20 03:50:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:50:09.128591 | orchestrator | 2026-04-20 03:50:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:50:12.172093 | orchestrator | 2026-04-20 03:50:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:50:12.173092 | orchestrator | 2026-04-20 03:50:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:50:12.173191 | orchestrator | 2026-04-20 03:50:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:50:15.212496 | orchestrator | 2026-04-20 03:50:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:50:15.213666 | orchestrator | 2026-04-20 03:50:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
03:50:15.213757 | orchestrator | 2026-04-20 03:50:15 | INFO  | Wait 1 second(s) until the next check
2026-04-20 03:50:18.261163 | orchestrator | 2026-04-20 03:50:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 03:50:18.262815 | orchestrator | 2026-04-20 03:50:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 03:50:18.262863 | orchestrator | 2026-04-20 03:50:18 | INFO  | Wait 1 second(s) until the next check
[... identical status checks repeated every ~3 seconds; both tasks remained in state STARTED from 03:50:21 through 03:55:11 ...]
2026-04-20 03:55:14.046379 | orchestrator | 2026-04-20 03:55:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 03:57:14.152120 | orchestrator | 2026-04-20 03:57:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 03:57:14.152330 | orchestrator | 2026-04-20 03:57:14 | INFO  | Wait 1 second(s) until the next check
[... identical status checks repeated every ~3 seconds; both tasks remained in state STARTED from 03:57:17 through 03:57:44 ...]
2026-04-20 03:57:47.675395 | orchestrator | 2026-04-20 03:57:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 03:57:47.678079 | orchestrator | 2026-04-20 03:57:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 03:57:47.678207 | orchestrator | 2026-04-20 03:57:47 | INFO  | Wait 1 second(s)
until the next check 2026-04-20 03:57:50.720056 | orchestrator | 2026-04-20 03:57:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:57:50.722337 | orchestrator | 2026-04-20 03:57:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:57:50.722400 | orchestrator | 2026-04-20 03:57:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:57:53.768893 | orchestrator | 2026-04-20 03:57:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:57:53.771319 | orchestrator | 2026-04-20 03:57:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:57:53.771399 | orchestrator | 2026-04-20 03:57:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:57:56.823424 | orchestrator | 2026-04-20 03:57:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:57:56.825024 | orchestrator | 2026-04-20 03:57:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:57:56.825058 | orchestrator | 2026-04-20 03:57:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:57:59.872327 | orchestrator | 2026-04-20 03:57:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:57:59.874063 | orchestrator | 2026-04-20 03:57:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:57:59.874125 | orchestrator | 2026-04-20 03:57:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:02.916162 | orchestrator | 2026-04-20 03:58:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:58:02.916693 | orchestrator | 2026-04-20 03:58:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:58:02.916726 | orchestrator | 2026-04-20 03:58:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:05.961048 | orchestrator | 2026-04-20 
03:58:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:58:05.962622 | orchestrator | 2026-04-20 03:58:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:58:05.962684 | orchestrator | 2026-04-20 03:58:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:09.009013 | orchestrator | 2026-04-20 03:58:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:58:09.010456 | orchestrator | 2026-04-20 03:58:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:58:09.010542 | orchestrator | 2026-04-20 03:58:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:12.060640 | orchestrator | 2026-04-20 03:58:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:58:12.061781 | orchestrator | 2026-04-20 03:58:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:58:12.061898 | orchestrator | 2026-04-20 03:58:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:15.111900 | orchestrator | 2026-04-20 03:58:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:58:15.113239 | orchestrator | 2026-04-20 03:58:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:58:15.113322 | orchestrator | 2026-04-20 03:58:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:18.168799 | orchestrator | 2026-04-20 03:58:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:58:18.171001 | orchestrator | 2026-04-20 03:58:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:58:18.171049 | orchestrator | 2026-04-20 03:58:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:21.213074 | orchestrator | 2026-04-20 03:58:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 03:58:21.215852 | orchestrator | 2026-04-20 03:58:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:58:21.216047 | orchestrator | 2026-04-20 03:58:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:24.261581 | orchestrator | 2026-04-20 03:58:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:58:24.263611 | orchestrator | 2026-04-20 03:58:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:58:24.263674 | orchestrator | 2026-04-20 03:58:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:27.315001 | orchestrator | 2026-04-20 03:58:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:58:27.317292 | orchestrator | 2026-04-20 03:58:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:58:27.317371 | orchestrator | 2026-04-20 03:58:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:30.364613 | orchestrator | 2026-04-20 03:58:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:58:30.365827 | orchestrator | 2026-04-20 03:58:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:58:30.365931 | orchestrator | 2026-04-20 03:58:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:33.418375 | orchestrator | 2026-04-20 03:58:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:58:33.420748 | orchestrator | 2026-04-20 03:58:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:58:33.420832 | orchestrator | 2026-04-20 03:58:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:36.467931 | orchestrator | 2026-04-20 03:58:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:58:36.472007 | orchestrator | 2026-04-20 03:58:36 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:58:36.472070 | orchestrator | 2026-04-20 03:58:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:39.522480 | orchestrator | 2026-04-20 03:58:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:58:39.523820 | orchestrator | 2026-04-20 03:58:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:58:39.523920 | orchestrator | 2026-04-20 03:58:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:42.573989 | orchestrator | 2026-04-20 03:58:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:58:42.575162 | orchestrator | 2026-04-20 03:58:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:58:42.575182 | orchestrator | 2026-04-20 03:58:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:45.620311 | orchestrator | 2026-04-20 03:58:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:58:45.622265 | orchestrator | 2026-04-20 03:58:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:58:45.622388 | orchestrator | 2026-04-20 03:58:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:48.671998 | orchestrator | 2026-04-20 03:58:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:58:48.673902 | orchestrator | 2026-04-20 03:58:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:58:48.674129 | orchestrator | 2026-04-20 03:58:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:51.717938 | orchestrator | 2026-04-20 03:58:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:58:51.719304 | orchestrator | 2026-04-20 03:58:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
03:58:51.719349 | orchestrator | 2026-04-20 03:58:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:54.764283 | orchestrator | 2026-04-20 03:58:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:58:54.766820 | orchestrator | 2026-04-20 03:58:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:58:54.766874 | orchestrator | 2026-04-20 03:58:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:58:57.812355 | orchestrator | 2026-04-20 03:58:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:58:57.814314 | orchestrator | 2026-04-20 03:58:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:58:57.814387 | orchestrator | 2026-04-20 03:58:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:00.859879 | orchestrator | 2026-04-20 03:59:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:00.861868 | orchestrator | 2026-04-20 03:59:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:00.861926 | orchestrator | 2026-04-20 03:59:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:03.905066 | orchestrator | 2026-04-20 03:59:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:03.907757 | orchestrator | 2026-04-20 03:59:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:03.907838 | orchestrator | 2026-04-20 03:59:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:06.950563 | orchestrator | 2026-04-20 03:59:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:06.952190 | orchestrator | 2026-04-20 03:59:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:06.952362 | orchestrator | 2026-04-20 03:59:06 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 03:59:09.999789 | orchestrator | 2026-04-20 03:59:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:10.001963 | orchestrator | 2026-04-20 03:59:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:10.002241 | orchestrator | 2026-04-20 03:59:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:13.052107 | orchestrator | 2026-04-20 03:59:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:13.054085 | orchestrator | 2026-04-20 03:59:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:13.054109 | orchestrator | 2026-04-20 03:59:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:16.093827 | orchestrator | 2026-04-20 03:59:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:16.096641 | orchestrator | 2026-04-20 03:59:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:16.096833 | orchestrator | 2026-04-20 03:59:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:19.146202 | orchestrator | 2026-04-20 03:59:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:19.147890 | orchestrator | 2026-04-20 03:59:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:19.147928 | orchestrator | 2026-04-20 03:59:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:22.193865 | orchestrator | 2026-04-20 03:59:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:22.196537 | orchestrator | 2026-04-20 03:59:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:22.196606 | orchestrator | 2026-04-20 03:59:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:25.243520 | orchestrator | 2026-04-20 
03:59:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:25.245460 | orchestrator | 2026-04-20 03:59:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:25.245489 | orchestrator | 2026-04-20 03:59:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:28.293220 | orchestrator | 2026-04-20 03:59:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:28.294824 | orchestrator | 2026-04-20 03:59:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:28.294867 | orchestrator | 2026-04-20 03:59:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:31.346290 | orchestrator | 2026-04-20 03:59:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:31.347308 | orchestrator | 2026-04-20 03:59:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:31.347400 | orchestrator | 2026-04-20 03:59:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:34.389375 | orchestrator | 2026-04-20 03:59:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:34.390922 | orchestrator | 2026-04-20 03:59:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:34.390977 | orchestrator | 2026-04-20 03:59:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:37.439396 | orchestrator | 2026-04-20 03:59:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:37.441136 | orchestrator | 2026-04-20 03:59:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:37.441199 | orchestrator | 2026-04-20 03:59:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:40.488262 | orchestrator | 2026-04-20 03:59:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 03:59:40.490960 | orchestrator | 2026-04-20 03:59:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:40.491044 | orchestrator | 2026-04-20 03:59:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:43.535676 | orchestrator | 2026-04-20 03:59:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:43.537695 | orchestrator | 2026-04-20 03:59:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:43.537859 | orchestrator | 2026-04-20 03:59:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:46.588854 | orchestrator | 2026-04-20 03:59:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:46.589731 | orchestrator | 2026-04-20 03:59:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:46.589765 | orchestrator | 2026-04-20 03:59:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:49.630970 | orchestrator | 2026-04-20 03:59:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:49.632844 | orchestrator | 2026-04-20 03:59:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:49.632907 | orchestrator | 2026-04-20 03:59:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:52.677660 | orchestrator | 2026-04-20 03:59:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:52.679215 | orchestrator | 2026-04-20 03:59:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:52.679265 | orchestrator | 2026-04-20 03:59:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:55.724428 | orchestrator | 2026-04-20 03:59:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:55.726508 | orchestrator | 2026-04-20 03:59:55 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:55.726570 | orchestrator | 2026-04-20 03:59:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 03:59:58.768291 | orchestrator | 2026-04-20 03:59:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 03:59:58.769926 | orchestrator | 2026-04-20 03:59:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 03:59:58.770152 | orchestrator | 2026-04-20 03:59:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:01.812593 | orchestrator | 2026-04-20 04:00:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:01.814232 | orchestrator | 2026-04-20 04:00:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:01.814299 | orchestrator | 2026-04-20 04:00:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:04.863649 | orchestrator | 2026-04-20 04:00:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:04.865710 | orchestrator | 2026-04-20 04:00:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:04.865808 | orchestrator | 2026-04-20 04:00:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:07.907589 | orchestrator | 2026-04-20 04:00:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:07.910005 | orchestrator | 2026-04-20 04:00:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:07.910372 | orchestrator | 2026-04-20 04:00:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:10.954923 | orchestrator | 2026-04-20 04:00:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:10.956882 | orchestrator | 2026-04-20 04:00:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
04:00:10.956931 | orchestrator | 2026-04-20 04:00:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:14.005479 | orchestrator | 2026-04-20 04:00:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:14.007625 | orchestrator | 2026-04-20 04:00:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:14.007912 | orchestrator | 2026-04-20 04:00:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:17.055829 | orchestrator | 2026-04-20 04:00:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:17.057451 | orchestrator | 2026-04-20 04:00:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:17.057504 | orchestrator | 2026-04-20 04:00:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:20.101248 | orchestrator | 2026-04-20 04:00:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:20.105562 | orchestrator | 2026-04-20 04:00:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:20.105644 | orchestrator | 2026-04-20 04:00:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:23.153573 | orchestrator | 2026-04-20 04:00:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:23.156316 | orchestrator | 2026-04-20 04:00:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:23.156431 | orchestrator | 2026-04-20 04:00:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:26.199342 | orchestrator | 2026-04-20 04:00:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:26.201187 | orchestrator | 2026-04-20 04:00:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:26.201252 | orchestrator | 2026-04-20 04:00:26 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 04:00:29.251149 | orchestrator | 2026-04-20 04:00:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:29.252867 | orchestrator | 2026-04-20 04:00:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:29.252936 | orchestrator | 2026-04-20 04:00:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:32.296682 | orchestrator | 2026-04-20 04:00:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:32.297915 | orchestrator | 2026-04-20 04:00:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:32.298008 | orchestrator | 2026-04-20 04:00:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:35.345492 | orchestrator | 2026-04-20 04:00:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:35.347004 | orchestrator | 2026-04-20 04:00:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:35.347107 | orchestrator | 2026-04-20 04:00:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:38.397216 | orchestrator | 2026-04-20 04:00:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:38.400946 | orchestrator | 2026-04-20 04:00:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:38.401028 | orchestrator | 2026-04-20 04:00:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:41.445621 | orchestrator | 2026-04-20 04:00:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:41.447167 | orchestrator | 2026-04-20 04:00:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:41.447645 | orchestrator | 2026-04-20 04:00:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:44.486853 | orchestrator | 2026-04-20 
04:00:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:44.488565 | orchestrator | 2026-04-20 04:00:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:44.488609 | orchestrator | 2026-04-20 04:00:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:47.534920 | orchestrator | 2026-04-20 04:00:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:47.538366 | orchestrator | 2026-04-20 04:00:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:47.538505 | orchestrator | 2026-04-20 04:00:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:50.585333 | orchestrator | 2026-04-20 04:00:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:50.586512 | orchestrator | 2026-04-20 04:00:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:50.586610 | orchestrator | 2026-04-20 04:00:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:53.633164 | orchestrator | 2026-04-20 04:00:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:53.635017 | orchestrator | 2026-04-20 04:00:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:53.635136 | orchestrator | 2026-04-20 04:00:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:56.681052 | orchestrator | 2026-04-20 04:00:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:00:56.683045 | orchestrator | 2026-04-20 04:00:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:56.683123 | orchestrator | 2026-04-20 04:00:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:00:59.726803 | orchestrator | 2026-04-20 04:00:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 04:00:59.727913 | orchestrator | 2026-04-20 04:00:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:00:59.727960 | orchestrator | 2026-04-20 04:00:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:01:02.772687 | orchestrator | 2026-04-20 04:01:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:01:02.773891 | orchestrator | 2026-04-20 04:01:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:01:02.773932 | orchestrator | 2026-04-20 04:01:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:01:05.817164 | orchestrator | 2026-04-20 04:01:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:01:05.818650 | orchestrator | 2026-04-20 04:01:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:01:05.818728 | orchestrator | 2026-04-20 04:01:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:01:08.861266 | orchestrator | 2026-04-20 04:01:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:01:08.861866 | orchestrator | 2026-04-20 04:01:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:01:08.861922 | orchestrator | 2026-04-20 04:01:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:01:11.909190 | orchestrator | 2026-04-20 04:01:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:01:11.910599 | orchestrator | 2026-04-20 04:01:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:01:11.910689 | orchestrator | 2026-04-20 04:01:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:01:14.957464 | orchestrator | 2026-04-20 04:01:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:01:14.960118 | orchestrator | 2026-04-20 04:01:14 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:01:14.960156 | orchestrator | 2026-04-20 04:01:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:01:18.010767 | orchestrator | 2026-04-20 04:01:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:01:18.013945 | orchestrator | 2026-04-20 04:01:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:01:18.014124 | orchestrator | 2026-04-20 04:01:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:01:21.054903 | orchestrator | 2026-04-20 04:01:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:01:21.055624 | orchestrator | 2026-04-20 04:01:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:01:21.055674 | orchestrator | 2026-04-20 04:01:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:01:24.095390 | orchestrator | 2026-04-20 04:01:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:01:24.097199 | orchestrator | 2026-04-20 04:01:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:01:24.097287 | orchestrator | 2026-04-20 04:01:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:01:27.141033 | orchestrator | 2026-04-20 04:01:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:01:27.141265 | orchestrator | 2026-04-20 04:01:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:01:27.141302 | orchestrator | 2026-04-20 04:01:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:01:30.186113 | orchestrator | 2026-04-20 04:01:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:01:30.187890 | orchestrator | 2026-04-20 04:01:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
04:01:30.187934 | orchestrator | 2026-04-20 04:01:30 | INFO  | Wait 1 second(s) until the next check
2026-04-20 04:01:33.238361 | orchestrator | 2026-04-20 04:01:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 04:01:33.240703 | orchestrator | 2026-04-20 04:01:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 04:01:33.240777 | orchestrator | 2026-04-20 04:01:33 | INFO  | Wait 1 second(s) until the next check
2026-04-20 04:06:31.971840 | orchestrator | 2026-04-20 04:06:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 04:06:31.974524 | orchestrator | 2026-04-20 04:06:31 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:06:31.974613 | orchestrator | 2026-04-20 04:06:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:06:35.027336 | orchestrator | 2026-04-20 04:06:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:06:35.028336 | orchestrator | 2026-04-20 04:06:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:06:35.028368 | orchestrator | 2026-04-20 04:06:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:06:38.084237 | orchestrator | 2026-04-20 04:06:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:06:38.086791 | orchestrator | 2026-04-20 04:06:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:06:38.086895 | orchestrator | 2026-04-20 04:06:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:06:41.133148 | orchestrator | 2026-04-20 04:06:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:06:41.136516 | orchestrator | 2026-04-20 04:06:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:06:41.136587 | orchestrator | 2026-04-20 04:06:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:06:44.182857 | orchestrator | 2026-04-20 04:06:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:06:44.184414 | orchestrator | 2026-04-20 04:06:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:06:44.184518 | orchestrator | 2026-04-20 04:06:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:06:47.238520 | orchestrator | 2026-04-20 04:06:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:06:47.240097 | orchestrator | 2026-04-20 04:06:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
04:06:47.240145 | orchestrator | 2026-04-20 04:06:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:06:50.288392 | orchestrator | 2026-04-20 04:06:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:06:50.288516 | orchestrator | 2026-04-20 04:06:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:06:50.288543 | orchestrator | 2026-04-20 04:06:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:06:53.338331 | orchestrator | 2026-04-20 04:06:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:06:53.340615 | orchestrator | 2026-04-20 04:06:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:06:53.340755 | orchestrator | 2026-04-20 04:06:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:06:56.391543 | orchestrator | 2026-04-20 04:06:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:06:56.393580 | orchestrator | 2026-04-20 04:06:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:06:56.393696 | orchestrator | 2026-04-20 04:06:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:06:59.443793 | orchestrator | 2026-04-20 04:06:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:06:59.445330 | orchestrator | 2026-04-20 04:06:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:06:59.445764 | orchestrator | 2026-04-20 04:06:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:07:02.496064 | orchestrator | 2026-04-20 04:07:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:07:02.497309 | orchestrator | 2026-04-20 04:07:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:02.497351 | orchestrator | 2026-04-20 04:07:02 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 04:07:05.543891 | orchestrator | 2026-04-20 04:07:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:07:05.547095 | orchestrator | 2026-04-20 04:07:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:05.547169 | orchestrator | 2026-04-20 04:07:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:07:08.595630 | orchestrator | 2026-04-20 04:07:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:07:08.598429 | orchestrator | 2026-04-20 04:07:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:08.598500 | orchestrator | 2026-04-20 04:07:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:07:11.648609 | orchestrator | 2026-04-20 04:07:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:07:11.650347 | orchestrator | 2026-04-20 04:07:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:11.650395 | orchestrator | 2026-04-20 04:07:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:07:14.713032 | orchestrator | 2026-04-20 04:07:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:07:14.714675 | orchestrator | 2026-04-20 04:07:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:14.714730 | orchestrator | 2026-04-20 04:07:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:07:17.759254 | orchestrator | 2026-04-20 04:07:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:07:17.761528 | orchestrator | 2026-04-20 04:07:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:17.761594 | orchestrator | 2026-04-20 04:07:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:07:20.807569 | orchestrator | 2026-04-20 
04:07:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:07:20.810543 | orchestrator | 2026-04-20 04:07:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:20.810872 | orchestrator | 2026-04-20 04:07:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:07:23.861467 | orchestrator | 2026-04-20 04:07:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:07:23.862970 | orchestrator | 2026-04-20 04:07:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:23.863028 | orchestrator | 2026-04-20 04:07:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:07:26.913256 | orchestrator | 2026-04-20 04:07:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:07:26.914869 | orchestrator | 2026-04-20 04:07:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:26.914918 | orchestrator | 2026-04-20 04:07:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:07:29.965799 | orchestrator | 2026-04-20 04:07:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:07:29.967689 | orchestrator | 2026-04-20 04:07:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:29.967751 | orchestrator | 2026-04-20 04:07:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:07:33.016203 | orchestrator | 2026-04-20 04:07:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:07:33.017821 | orchestrator | 2026-04-20 04:07:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:33.017922 | orchestrator | 2026-04-20 04:07:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:07:36.068365 | orchestrator | 2026-04-20 04:07:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 04:07:36.070321 | orchestrator | 2026-04-20 04:07:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:36.070393 | orchestrator | 2026-04-20 04:07:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:07:39.126381 | orchestrator | 2026-04-20 04:07:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:07:39.127934 | orchestrator | 2026-04-20 04:07:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:39.127968 | orchestrator | 2026-04-20 04:07:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:07:42.171323 | orchestrator | 2026-04-20 04:07:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:07:42.172679 | orchestrator | 2026-04-20 04:07:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:42.172737 | orchestrator | 2026-04-20 04:07:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:07:45.225606 | orchestrator | 2026-04-20 04:07:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:07:45.227360 | orchestrator | 2026-04-20 04:07:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:45.227403 | orchestrator | 2026-04-20 04:07:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:07:48.274049 | orchestrator | 2026-04-20 04:07:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:07:48.275137 | orchestrator | 2026-04-20 04:07:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:48.275217 | orchestrator | 2026-04-20 04:07:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:07:51.317850 | orchestrator | 2026-04-20 04:07:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:07:51.319249 | orchestrator | 2026-04-20 04:07:51 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:51.319315 | orchestrator | 2026-04-20 04:07:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:07:54.368925 | orchestrator | 2026-04-20 04:07:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:07:54.370114 | orchestrator | 2026-04-20 04:07:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:54.370192 | orchestrator | 2026-04-20 04:07:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:07:57.409825 | orchestrator | 2026-04-20 04:07:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:07:57.411244 | orchestrator | 2026-04-20 04:07:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:07:57.411358 | orchestrator | 2026-04-20 04:07:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:00.454467 | orchestrator | 2026-04-20 04:08:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:00.459165 | orchestrator | 2026-04-20 04:08:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:00.459222 | orchestrator | 2026-04-20 04:08:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:03.505405 | orchestrator | 2026-04-20 04:08:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:03.507967 | orchestrator | 2026-04-20 04:08:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:03.508037 | orchestrator | 2026-04-20 04:08:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:06.553970 | orchestrator | 2026-04-20 04:08:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:06.557204 | orchestrator | 2026-04-20 04:08:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
04:08:06.557296 | orchestrator | 2026-04-20 04:08:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:09.607390 | orchestrator | 2026-04-20 04:08:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:09.609933 | orchestrator | 2026-04-20 04:08:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:09.610121 | orchestrator | 2026-04-20 04:08:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:12.657334 | orchestrator | 2026-04-20 04:08:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:12.659948 | orchestrator | 2026-04-20 04:08:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:12.660121 | orchestrator | 2026-04-20 04:08:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:15.710354 | orchestrator | 2026-04-20 04:08:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:15.711885 | orchestrator | 2026-04-20 04:08:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:15.711934 | orchestrator | 2026-04-20 04:08:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:18.776730 | orchestrator | 2026-04-20 04:08:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:18.778529 | orchestrator | 2026-04-20 04:08:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:18.778602 | orchestrator | 2026-04-20 04:08:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:21.822641 | orchestrator | 2026-04-20 04:08:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:21.825150 | orchestrator | 2026-04-20 04:08:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:21.825235 | orchestrator | 2026-04-20 04:08:21 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 04:08:24.876220 | orchestrator | 2026-04-20 04:08:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:24.877673 | orchestrator | 2026-04-20 04:08:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:24.877772 | orchestrator | 2026-04-20 04:08:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:27.925591 | orchestrator | 2026-04-20 04:08:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:27.927264 | orchestrator | 2026-04-20 04:08:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:27.927304 | orchestrator | 2026-04-20 04:08:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:30.975678 | orchestrator | 2026-04-20 04:08:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:30.977963 | orchestrator | 2026-04-20 04:08:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:30.978297 | orchestrator | 2026-04-20 04:08:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:34.032035 | orchestrator | 2026-04-20 04:08:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:34.033297 | orchestrator | 2026-04-20 04:08:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:34.033347 | orchestrator | 2026-04-20 04:08:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:37.082953 | orchestrator | 2026-04-20 04:08:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:37.084487 | orchestrator | 2026-04-20 04:08:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:37.084527 | orchestrator | 2026-04-20 04:08:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:40.127678 | orchestrator | 2026-04-20 
04:08:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:40.128870 | orchestrator | 2026-04-20 04:08:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:40.128899 | orchestrator | 2026-04-20 04:08:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:43.172037 | orchestrator | 2026-04-20 04:08:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:43.173056 | orchestrator | 2026-04-20 04:08:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:43.173091 | orchestrator | 2026-04-20 04:08:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:46.219825 | orchestrator | 2026-04-20 04:08:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:46.221052 | orchestrator | 2026-04-20 04:08:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:46.221091 | orchestrator | 2026-04-20 04:08:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:49.262910 | orchestrator | 2026-04-20 04:08:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:49.264181 | orchestrator | 2026-04-20 04:08:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:49.264239 | orchestrator | 2026-04-20 04:08:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:52.314402 | orchestrator | 2026-04-20 04:08:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:52.317472 | orchestrator | 2026-04-20 04:08:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:52.317661 | orchestrator | 2026-04-20 04:08:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:55.365911 | orchestrator | 2026-04-20 04:08:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 04:08:55.367623 | orchestrator | 2026-04-20 04:08:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:55.367918 | orchestrator | 2026-04-20 04:08:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:08:58.417111 | orchestrator | 2026-04-20 04:08:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:08:58.419106 | orchestrator | 2026-04-20 04:08:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:08:58.419155 | orchestrator | 2026-04-20 04:08:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:01.464855 | orchestrator | 2026-04-20 04:09:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:01.465921 | orchestrator | 2026-04-20 04:09:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:01.466111 | orchestrator | 2026-04-20 04:09:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:04.512542 | orchestrator | 2026-04-20 04:09:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:04.514430 | orchestrator | 2026-04-20 04:09:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:04.514660 | orchestrator | 2026-04-20 04:09:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:07.563159 | orchestrator | 2026-04-20 04:09:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:07.567014 | orchestrator | 2026-04-20 04:09:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:07.567193 | orchestrator | 2026-04-20 04:09:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:10.615626 | orchestrator | 2026-04-20 04:09:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:10.616965 | orchestrator | 2026-04-20 04:09:10 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:10.617090 | orchestrator | 2026-04-20 04:09:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:13.664331 | orchestrator | 2026-04-20 04:09:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:13.664962 | orchestrator | 2026-04-20 04:09:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:13.665001 | orchestrator | 2026-04-20 04:09:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:16.704838 | orchestrator | 2026-04-20 04:09:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:16.706875 | orchestrator | 2026-04-20 04:09:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:16.706923 | orchestrator | 2026-04-20 04:09:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:19.752884 | orchestrator | 2026-04-20 04:09:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:19.755526 | orchestrator | 2026-04-20 04:09:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:19.755598 | orchestrator | 2026-04-20 04:09:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:22.803176 | orchestrator | 2026-04-20 04:09:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:22.805296 | orchestrator | 2026-04-20 04:09:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:22.805334 | orchestrator | 2026-04-20 04:09:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:25.858333 | orchestrator | 2026-04-20 04:09:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:25.861463 | orchestrator | 2026-04-20 04:09:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
04:09:25.861639 | orchestrator | 2026-04-20 04:09:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:28.917400 | orchestrator | 2026-04-20 04:09:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:28.919835 | orchestrator | 2026-04-20 04:09:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:28.919931 | orchestrator | 2026-04-20 04:09:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:31.964435 | orchestrator | 2026-04-20 04:09:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:31.967215 | orchestrator | 2026-04-20 04:09:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:31.967250 | orchestrator | 2026-04-20 04:09:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:35.016586 | orchestrator | 2026-04-20 04:09:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:35.018423 | orchestrator | 2026-04-20 04:09:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:35.018664 | orchestrator | 2026-04-20 04:09:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:38.061418 | orchestrator | 2026-04-20 04:09:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:38.063991 | orchestrator | 2026-04-20 04:09:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:38.064089 | orchestrator | 2026-04-20 04:09:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:41.108948 | orchestrator | 2026-04-20 04:09:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:41.110367 | orchestrator | 2026-04-20 04:09:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:41.110419 | orchestrator | 2026-04-20 04:09:41 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 04:09:44.157527 | orchestrator | 2026-04-20 04:09:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:44.158898 | orchestrator | 2026-04-20 04:09:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:44.158932 | orchestrator | 2026-04-20 04:09:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:47.200819 | orchestrator | 2026-04-20 04:09:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:47.202849 | orchestrator | 2026-04-20 04:09:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:47.203234 | orchestrator | 2026-04-20 04:09:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:50.252296 | orchestrator | 2026-04-20 04:09:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:50.253473 | orchestrator | 2026-04-20 04:09:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:50.253589 | orchestrator | 2026-04-20 04:09:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:53.300891 | orchestrator | 2026-04-20 04:09:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:53.302303 | orchestrator | 2026-04-20 04:09:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:53.302352 | orchestrator | 2026-04-20 04:09:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:56.345137 | orchestrator | 2026-04-20 04:09:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:56.346571 | orchestrator | 2026-04-20 04:09:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:56.346593 | orchestrator | 2026-04-20 04:09:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:09:59.403976 | orchestrator | 2026-04-20 
04:09:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:09:59.405671 | orchestrator | 2026-04-20 04:09:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:09:59.405897 | orchestrator | 2026-04-20 04:09:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:10:02.446523 | orchestrator | 2026-04-20 04:10:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:10:02.447983 | orchestrator | 2026-04-20 04:10:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:10:02.448032 | orchestrator | 2026-04-20 04:10:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:10:05.503526 | orchestrator | 2026-04-20 04:10:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:10:05.506490 | orchestrator | 2026-04-20 04:10:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:10:05.506917 | orchestrator | 2026-04-20 04:10:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:10:08.551613 | orchestrator | 2026-04-20 04:10:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:10:08.551706 | orchestrator | 2026-04-20 04:10:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:10:08.551716 | orchestrator | 2026-04-20 04:10:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:10:11.596628 | orchestrator | 2026-04-20 04:10:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:10:11.597538 | orchestrator | 2026-04-20 04:10:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:10:11.597590 | orchestrator | 2026-04-20 04:10:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:10:14.652362 | orchestrator | 2026-04-20 04:10:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED
2026-04-20 04:10:14.654582 | orchestrator | 2026-04-20 04:10:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 04:10:14.655039 | orchestrator | 2026-04-20 04:10:14 | INFO  | Wait 1 second(s) until the next check
2026-04-20 04:10:17.702249 | orchestrator | 2026-04-20 04:10:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 04:10:17.702495 | orchestrator | 2026-04-20 04:10:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 04:10:17.702526 | orchestrator | 2026-04-20 04:10:17 | INFO  | Wait 1 second(s) until the next check
[... identical polling cycle repeated roughly every 3 seconds through 2026-04-20 04:15:47; tasks 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e and 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c remained in state STARTED throughout ...]
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:15:47.092127 | orchestrator | 2026-04-20 04:15:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:15:50.127131 | orchestrator | 2026-04-20 04:15:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:15:50.130075 | orchestrator | 2026-04-20 04:15:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:15:50.130186 | orchestrator | 2026-04-20 04:15:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:15:53.172233 | orchestrator | 2026-04-20 04:15:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:15:53.173913 | orchestrator | 2026-04-20 04:15:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:15:53.173936 | orchestrator | 2026-04-20 04:15:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:15:56.217808 | orchestrator | 2026-04-20 04:15:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:15:56.220400 | orchestrator | 2026-04-20 04:15:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:15:56.220519 | orchestrator | 2026-04-20 04:15:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:15:59.271105 | orchestrator | 2026-04-20 04:15:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:15:59.273118 | orchestrator | 2026-04-20 04:15:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:15:59.273173 | orchestrator | 2026-04-20 04:15:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:16:02.326293 | orchestrator | 2026-04-20 04:16:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:16:02.328211 | orchestrator | 2026-04-20 04:16:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
04:16:02.328260 | orchestrator | 2026-04-20 04:16:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:16:05.378349 | orchestrator | 2026-04-20 04:16:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:16:05.382136 | orchestrator | 2026-04-20 04:16:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:16:05.382282 | orchestrator | 2026-04-20 04:16:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:16:08.429338 | orchestrator | 2026-04-20 04:16:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:16:08.431538 | orchestrator | 2026-04-20 04:16:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:16:08.431703 | orchestrator | 2026-04-20 04:16:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:16:11.477641 | orchestrator | 2026-04-20 04:16:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:16:11.479312 | orchestrator | 2026-04-20 04:16:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:16:11.479593 | orchestrator | 2026-04-20 04:16:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:16:14.523481 | orchestrator | 2026-04-20 04:16:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:16:14.525100 | orchestrator | 2026-04-20 04:16:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:16:14.525571 | orchestrator | 2026-04-20 04:16:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:16:17.574799 | orchestrator | 2026-04-20 04:16:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:16:17.576882 | orchestrator | 2026-04-20 04:16:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:16:17.576972 | orchestrator | 2026-04-20 04:16:17 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 04:16:20.620206 | orchestrator | 2026-04-20 04:16:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:16:20.622111 | orchestrator | 2026-04-20 04:16:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:16:20.622170 | orchestrator | 2026-04-20 04:16:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:16:23.675187 | orchestrator | 2026-04-20 04:16:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:16:23.677279 | orchestrator | 2026-04-20 04:16:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:16:23.677399 | orchestrator | 2026-04-20 04:16:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:16:26.733102 | orchestrator | 2026-04-20 04:16:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:16:26.734469 | orchestrator | 2026-04-20 04:16:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:16:26.734503 | orchestrator | 2026-04-20 04:16:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:16:29.775200 | orchestrator | 2026-04-20 04:16:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:16:29.778271 | orchestrator | 2026-04-20 04:16:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:16:29.778405 | orchestrator | 2026-04-20 04:16:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:16:32.825831 | orchestrator | 2026-04-20 04:16:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:16:32.827788 | orchestrator | 2026-04-20 04:16:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:16:32.828038 | orchestrator | 2026-04-20 04:16:32 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:16:35.876564 | orchestrator | 2026-04-20 
04:16:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:16:35.878256 | orchestrator | 2026-04-20 04:16:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:16:35.878571 | orchestrator | 2026-04-20 04:16:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:16:38.932174 | orchestrator | 2026-04-20 04:16:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:16:38.933551 | orchestrator | 2026-04-20 04:16:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:16:38.933585 | orchestrator | 2026-04-20 04:16:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:16:41.982785 | orchestrator | 2026-04-20 04:16:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:16:41.985050 | orchestrator | 2026-04-20 04:16:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:16:41.985091 | orchestrator | 2026-04-20 04:16:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:16:45.036698 | orchestrator | 2026-04-20 04:16:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:16:45.039615 | orchestrator | 2026-04-20 04:16:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:16:45.039683 | orchestrator | 2026-04-20 04:16:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:16:48.088463 | orchestrator | 2026-04-20 04:16:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:16:48.090330 | orchestrator | 2026-04-20 04:16:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:16:48.090396 | orchestrator | 2026-04-20 04:16:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:16:51.134082 | orchestrator | 2026-04-20 04:16:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 04:16:51.137036 | orchestrator | 2026-04-20 04:16:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:16:51.137097 | orchestrator | 2026-04-20 04:16:51 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:16:54.185985 | orchestrator | 2026-04-20 04:16:54 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:16:54.188383 | orchestrator | 2026-04-20 04:16:54 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:16:54.188461 | orchestrator | 2026-04-20 04:16:54 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:16:57.236446 | orchestrator | 2026-04-20 04:16:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:16:57.237366 | orchestrator | 2026-04-20 04:16:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:16:57.237541 | orchestrator | 2026-04-20 04:16:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:00.285321 | orchestrator | 2026-04-20 04:17:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:00.286873 | orchestrator | 2026-04-20 04:17:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:00.286947 | orchestrator | 2026-04-20 04:17:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:03.329317 | orchestrator | 2026-04-20 04:17:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:03.330007 | orchestrator | 2026-04-20 04:17:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:03.330179 | orchestrator | 2026-04-20 04:17:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:06.379147 | orchestrator | 2026-04-20 04:17:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:06.380507 | orchestrator | 2026-04-20 04:17:06 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:06.380726 | orchestrator | 2026-04-20 04:17:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:09.429634 | orchestrator | 2026-04-20 04:17:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:09.431879 | orchestrator | 2026-04-20 04:17:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:09.432127 | orchestrator | 2026-04-20 04:17:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:12.480690 | orchestrator | 2026-04-20 04:17:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:12.481359 | orchestrator | 2026-04-20 04:17:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:12.481415 | orchestrator | 2026-04-20 04:17:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:15.529628 | orchestrator | 2026-04-20 04:17:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:15.530763 | orchestrator | 2026-04-20 04:17:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:15.530805 | orchestrator | 2026-04-20 04:17:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:18.579872 | orchestrator | 2026-04-20 04:17:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:18.582676 | orchestrator | 2026-04-20 04:17:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:18.582741 | orchestrator | 2026-04-20 04:17:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:21.624842 | orchestrator | 2026-04-20 04:17:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:21.626483 | orchestrator | 2026-04-20 04:17:21 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
04:17:21.626568 | orchestrator | 2026-04-20 04:17:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:24.675810 | orchestrator | 2026-04-20 04:17:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:24.677425 | orchestrator | 2026-04-20 04:17:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:24.677471 | orchestrator | 2026-04-20 04:17:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:27.727510 | orchestrator | 2026-04-20 04:17:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:27.729332 | orchestrator | 2026-04-20 04:17:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:27.729397 | orchestrator | 2026-04-20 04:17:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:30.774581 | orchestrator | 2026-04-20 04:17:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:30.776253 | orchestrator | 2026-04-20 04:17:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:30.776314 | orchestrator | 2026-04-20 04:17:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:33.820614 | orchestrator | 2026-04-20 04:17:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:33.824252 | orchestrator | 2026-04-20 04:17:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:33.824325 | orchestrator | 2026-04-20 04:17:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:36.876402 | orchestrator | 2026-04-20 04:17:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:36.877610 | orchestrator | 2026-04-20 04:17:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:36.877658 | orchestrator | 2026-04-20 04:17:36 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 04:17:39.927335 | orchestrator | 2026-04-20 04:17:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:39.928819 | orchestrator | 2026-04-20 04:17:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:39.928882 | orchestrator | 2026-04-20 04:17:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:42.981999 | orchestrator | 2026-04-20 04:17:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:42.984480 | orchestrator | 2026-04-20 04:17:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:42.984548 | orchestrator | 2026-04-20 04:17:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:46.036401 | orchestrator | 2026-04-20 04:17:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:46.038105 | orchestrator | 2026-04-20 04:17:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:46.038152 | orchestrator | 2026-04-20 04:17:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:49.084320 | orchestrator | 2026-04-20 04:17:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:49.086518 | orchestrator | 2026-04-20 04:17:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:49.086630 | orchestrator | 2026-04-20 04:17:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:52.132732 | orchestrator | 2026-04-20 04:17:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:52.133663 | orchestrator | 2026-04-20 04:17:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:52.133735 | orchestrator | 2026-04-20 04:17:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:55.184248 | orchestrator | 2026-04-20 
04:17:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:55.185911 | orchestrator | 2026-04-20 04:17:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:55.186150 | orchestrator | 2026-04-20 04:17:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:17:58.231249 | orchestrator | 2026-04-20 04:17:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:17:58.232760 | orchestrator | 2026-04-20 04:17:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:17:58.232807 | orchestrator | 2026-04-20 04:17:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:01.280767 | orchestrator | 2026-04-20 04:18:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:01.283162 | orchestrator | 2026-04-20 04:18:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:01.283218 | orchestrator | 2026-04-20 04:18:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:04.327448 | orchestrator | 2026-04-20 04:18:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:04.329469 | orchestrator | 2026-04-20 04:18:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:04.329719 | orchestrator | 2026-04-20 04:18:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:07.380053 | orchestrator | 2026-04-20 04:18:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:07.381784 | orchestrator | 2026-04-20 04:18:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:07.381903 | orchestrator | 2026-04-20 04:18:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:10.430845 | orchestrator | 2026-04-20 04:18:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 04:18:10.432590 | orchestrator | 2026-04-20 04:18:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:10.432648 | orchestrator | 2026-04-20 04:18:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:13.477594 | orchestrator | 2026-04-20 04:18:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:13.479144 | orchestrator | 2026-04-20 04:18:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:13.479188 | orchestrator | 2026-04-20 04:18:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:16.529320 | orchestrator | 2026-04-20 04:18:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:16.530708 | orchestrator | 2026-04-20 04:18:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:16.530789 | orchestrator | 2026-04-20 04:18:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:19.576660 | orchestrator | 2026-04-20 04:18:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:19.577651 | orchestrator | 2026-04-20 04:18:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:19.577694 | orchestrator | 2026-04-20 04:18:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:22.621845 | orchestrator | 2026-04-20 04:18:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:22.624262 | orchestrator | 2026-04-20 04:18:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:22.624355 | orchestrator | 2026-04-20 04:18:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:25.674664 | orchestrator | 2026-04-20 04:18:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:25.676799 | orchestrator | 2026-04-20 04:18:25 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:25.676889 | orchestrator | 2026-04-20 04:18:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:28.731323 | orchestrator | 2026-04-20 04:18:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:28.733058 | orchestrator | 2026-04-20 04:18:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:28.733136 | orchestrator | 2026-04-20 04:18:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:31.781868 | orchestrator | 2026-04-20 04:18:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:31.784915 | orchestrator | 2026-04-20 04:18:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:31.785064 | orchestrator | 2026-04-20 04:18:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:34.828466 | orchestrator | 2026-04-20 04:18:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:34.830731 | orchestrator | 2026-04-20 04:18:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:34.830794 | orchestrator | 2026-04-20 04:18:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:37.871813 | orchestrator | 2026-04-20 04:18:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:37.874221 | orchestrator | 2026-04-20 04:18:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:37.874282 | orchestrator | 2026-04-20 04:18:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:40.924386 | orchestrator | 2026-04-20 04:18:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:40.925584 | orchestrator | 2026-04-20 04:18:40 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
04:18:40.925711 | orchestrator | 2026-04-20 04:18:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:43.976434 | orchestrator | 2026-04-20 04:18:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:43.979125 | orchestrator | 2026-04-20 04:18:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:43.979491 | orchestrator | 2026-04-20 04:18:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:47.033240 | orchestrator | 2026-04-20 04:18:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:47.035416 | orchestrator | 2026-04-20 04:18:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:47.035469 | orchestrator | 2026-04-20 04:18:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:50.089308 | orchestrator | 2026-04-20 04:18:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:50.091993 | orchestrator | 2026-04-20 04:18:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:50.092051 | orchestrator | 2026-04-20 04:18:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:53.145290 | orchestrator | 2026-04-20 04:18:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:53.146567 | orchestrator | 2026-04-20 04:18:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:53.146640 | orchestrator | 2026-04-20 04:18:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:18:56.201198 | orchestrator | 2026-04-20 04:18:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:56.203436 | orchestrator | 2026-04-20 04:18:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:56.203490 | orchestrator | 2026-04-20 04:18:56 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 04:18:59.247796 | orchestrator | 2026-04-20 04:18:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:18:59.249199 | orchestrator | 2026-04-20 04:18:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:18:59.249264 | orchestrator | 2026-04-20 04:18:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:19:02.290825 | orchestrator | 2026-04-20 04:19:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:19:02.291431 | orchestrator | 2026-04-20 04:19:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:19:02.291514 | orchestrator | 2026-04-20 04:19:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:19:05.340638 | orchestrator | 2026-04-20 04:19:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:19:05.343275 | orchestrator | 2026-04-20 04:19:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:19:05.343380 | orchestrator | 2026-04-20 04:19:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:19:08.386776 | orchestrator | 2026-04-20 04:19:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:19:08.388727 | orchestrator | 2026-04-20 04:19:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:19:08.388876 | orchestrator | 2026-04-20 04:19:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:19:11.444951 | orchestrator | 2026-04-20 04:19:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:19:11.447117 | orchestrator | 2026-04-20 04:19:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:19:11.447282 | orchestrator | 2026-04-20 04:19:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:19:14.490132 | orchestrator | 2026-04-20 
04:19:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:19:14.490872 | orchestrator | 2026-04-20 04:19:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:19:14.491242 | orchestrator | 2026-04-20 04:19:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:19:17.544824 | orchestrator | 2026-04-20 04:19:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:19:17.546200 | orchestrator | 2026-04-20 04:19:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:19:17.546244 | orchestrator | 2026-04-20 04:19:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:19:20.592424 | orchestrator | 2026-04-20 04:19:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:19:20.594511 | orchestrator | 2026-04-20 04:19:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:19:20.594557 | orchestrator | 2026-04-20 04:19:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:19:23.642600 | orchestrator | 2026-04-20 04:19:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:19:23.645219 | orchestrator | 2026-04-20 04:19:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:19:23.645344 | orchestrator | 2026-04-20 04:19:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:19:26.692540 | orchestrator | 2026-04-20 04:19:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:19:26.695050 | orchestrator | 2026-04-20 04:19:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:19:26.695121 | orchestrator | 2026-04-20 04:19:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:19:29.742006 | orchestrator | 2026-04-20 04:19:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED
2026-04-20 04:19:29.744109 | orchestrator | 2026-04-20 04:19:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 04:19:29.744158 | orchestrator | 2026-04-20 04:19:29 | INFO  | Wait 1 second(s) until the next check
2026-04-20 04:19:32.790708 | orchestrator | 2026-04-20 04:19:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED
2026-04-20 04:19:32.792460 | orchestrator | 2026-04-20 04:19:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED
2026-04-20 04:19:32.792517 | orchestrator | 2026-04-20 04:19:32 | INFO  | Wait 1 second(s) until the next check
[... repeated polling output elided: tasks 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e and 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c remained in state STARTED, re-checked every ~3 seconds from 04:19:35 through 04:24:43 ...]
2026-04-20 04:24:46.878470 | orchestrator | 2026-04-20 04:24:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state
STARTED 2026-04-20 04:24:46.880352 | orchestrator | 2026-04-20 04:24:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:24:46.880429 | orchestrator | 2026-04-20 04:24:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:24:49.929334 | orchestrator | 2026-04-20 04:24:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:24:49.930228 | orchestrator | 2026-04-20 04:24:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:24:49.930285 | orchestrator | 2026-04-20 04:24:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:24:52.980876 | orchestrator | 2026-04-20 04:24:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:24:52.984280 | orchestrator | 2026-04-20 04:24:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:24:52.984442 | orchestrator | 2026-04-20 04:24:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:24:56.031986 | orchestrator | 2026-04-20 04:24:56 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:24:56.033971 | orchestrator | 2026-04-20 04:24:56 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:24:56.034180 | orchestrator | 2026-04-20 04:24:56 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:24:59.080495 | orchestrator | 2026-04-20 04:24:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:24:59.082545 | orchestrator | 2026-04-20 04:24:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:24:59.082589 | orchestrator | 2026-04-20 04:24:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:25:02.137310 | orchestrator | 2026-04-20 04:25:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:02.140192 | orchestrator | 2026-04-20 04:25:02 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:25:02.140622 | orchestrator | 2026-04-20 04:25:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:25:05.189307 | orchestrator | 2026-04-20 04:25:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:05.191191 | orchestrator | 2026-04-20 04:25:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:25:05.191241 | orchestrator | 2026-04-20 04:25:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:25:08.240386 | orchestrator | 2026-04-20 04:25:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:08.241911 | orchestrator | 2026-04-20 04:25:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:25:08.241964 | orchestrator | 2026-04-20 04:25:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:25:11.288205 | orchestrator | 2026-04-20 04:25:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:11.290201 | orchestrator | 2026-04-20 04:25:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:25:11.290269 | orchestrator | 2026-04-20 04:25:11 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:25:14.336485 | orchestrator | 2026-04-20 04:25:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:14.336897 | orchestrator | 2026-04-20 04:25:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:25:14.336926 | orchestrator | 2026-04-20 04:25:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:25:17.387922 | orchestrator | 2026-04-20 04:25:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:17.389541 | orchestrator | 2026-04-20 04:25:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
04:25:17.389580 | orchestrator | 2026-04-20 04:25:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:25:20.434236 | orchestrator | 2026-04-20 04:25:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:20.434389 | orchestrator | 2026-04-20 04:25:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:25:20.434404 | orchestrator | 2026-04-20 04:25:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:25:23.477247 | orchestrator | 2026-04-20 04:25:23 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:23.478619 | orchestrator | 2026-04-20 04:25:23 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:25:23.478668 | orchestrator | 2026-04-20 04:25:23 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:25:26.525674 | orchestrator | 2026-04-20 04:25:26 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:26.525907 | orchestrator | 2026-04-20 04:25:26 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:25:26.525926 | orchestrator | 2026-04-20 04:25:26 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:25:29.577167 | orchestrator | 2026-04-20 04:25:29 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:29.578725 | orchestrator | 2026-04-20 04:25:29 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:25:29.578779 | orchestrator | 2026-04-20 04:25:29 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:25:32.626641 | orchestrator | 2026-04-20 04:25:32 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:32.627375 | orchestrator | 2026-04-20 04:25:32 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:25:32.627398 | orchestrator | 2026-04-20 04:25:32 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 04:25:35.672408 | orchestrator | 2026-04-20 04:25:35 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:35.673952 | orchestrator | 2026-04-20 04:25:35 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:25:35.674194 | orchestrator | 2026-04-20 04:25:35 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:25:38.720467 | orchestrator | 2026-04-20 04:25:38 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:38.721606 | orchestrator | 2026-04-20 04:25:38 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:25:38.721750 | orchestrator | 2026-04-20 04:25:38 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:25:41.766372 | orchestrator | 2026-04-20 04:25:41 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:41.768944 | orchestrator | 2026-04-20 04:25:41 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:25:41.769055 | orchestrator | 2026-04-20 04:25:41 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:25:44.812039 | orchestrator | 2026-04-20 04:25:44 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:44.813865 | orchestrator | 2026-04-20 04:25:44 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:25:44.813950 | orchestrator | 2026-04-20 04:25:44 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:25:47.862885 | orchestrator | 2026-04-20 04:25:47 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:47.864517 | orchestrator | 2026-04-20 04:25:47 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:25:47.864553 | orchestrator | 2026-04-20 04:25:47 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:25:50.919836 | orchestrator | 2026-04-20 
04:25:50 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:50.922345 | orchestrator | 2026-04-20 04:25:50 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:25:50.922409 | orchestrator | 2026-04-20 04:25:50 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:25:53.974554 | orchestrator | 2026-04-20 04:25:53 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:53.976020 | orchestrator | 2026-04-20 04:25:53 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:25:53.976083 | orchestrator | 2026-04-20 04:25:53 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:25:57.027319 | orchestrator | 2026-04-20 04:25:57 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:25:57.029406 | orchestrator | 2026-04-20 04:25:57 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:25:57.029488 | orchestrator | 2026-04-20 04:25:57 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:26:00.071497 | orchestrator | 2026-04-20 04:26:00 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:26:00.073120 | orchestrator | 2026-04-20 04:26:00 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:26:00.073369 | orchestrator | 2026-04-20 04:26:00 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:26:03.125528 | orchestrator | 2026-04-20 04:26:03 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:26:03.127720 | orchestrator | 2026-04-20 04:26:03 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:26:03.127833 | orchestrator | 2026-04-20 04:26:03 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:26:06.168972 | orchestrator | 2026-04-20 04:26:06 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 04:26:06.170997 | orchestrator | 2026-04-20 04:26:06 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:26:06.171047 | orchestrator | 2026-04-20 04:26:06 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:26:09.217241 | orchestrator | 2026-04-20 04:26:09 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:26:09.219812 | orchestrator | 2026-04-20 04:26:09 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:26:09.220477 | orchestrator | 2026-04-20 04:26:09 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:26:12.267254 | orchestrator | 2026-04-20 04:26:12 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:26:12.270666 | orchestrator | 2026-04-20 04:26:12 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:26:12.270717 | orchestrator | 2026-04-20 04:26:12 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:26:15.317895 | orchestrator | 2026-04-20 04:26:15 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:26:15.318803 | orchestrator | 2026-04-20 04:26:15 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:26:15.319035 | orchestrator | 2026-04-20 04:26:15 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:28:18.471898 | orchestrator | 2026-04-20 04:28:18 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:28:18.472216 | orchestrator | 2026-04-20 04:28:18 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:28:18.472253 | orchestrator | 2026-04-20 04:28:18 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:28:21.516481 | orchestrator | 2026-04-20 04:28:21 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:28:21.519245 | orchestrator | 2026-04-20 04:28:21 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:28:21.519338 | orchestrator | 2026-04-20 04:28:21 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:28:24.564372 | orchestrator | 2026-04-20 04:28:24 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:28:24.566591 | orchestrator | 2026-04-20 04:28:24 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:28:24.566644 | orchestrator | 2026-04-20 04:28:24 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:28:27.614190 | orchestrator | 2026-04-20 04:28:27 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:28:27.615442 | orchestrator | 2026-04-20 04:28:27 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:28:27.615508 | orchestrator | 2026-04-20 04:28:27 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:28:30.654724 | orchestrator | 2026-04-20 04:28:30 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:28:30.655390 | orchestrator | 2026-04-20 04:28:30 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:28:30.655412 | orchestrator | 2026-04-20 04:28:30 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:28:33.703328 | orchestrator | 2026-04-20 04:28:33 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:28:33.707383 | orchestrator | 2026-04-20 04:28:33 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:28:33.707465 | orchestrator | 2026-04-20 04:28:33 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:28:36.756446 | orchestrator | 2026-04-20 04:28:36 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:28:36.758386 | orchestrator | 2026-04-20 04:28:36 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
04:28:36.758461 | orchestrator | 2026-04-20 04:28:36 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:28:39.803545 | orchestrator | 2026-04-20 04:28:39 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:28:39.806182 | orchestrator | 2026-04-20 04:28:39 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:28:39.806333 | orchestrator | 2026-04-20 04:28:39 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:28:42.853605 | orchestrator | 2026-04-20 04:28:42 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:28:42.857470 | orchestrator | 2026-04-20 04:28:42 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:28:42.857552 | orchestrator | 2026-04-20 04:28:42 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:28:45.902488 | orchestrator | 2026-04-20 04:28:45 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:28:45.904468 | orchestrator | 2026-04-20 04:28:45 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:28:45.904566 | orchestrator | 2026-04-20 04:28:45 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:28:48.946791 | orchestrator | 2026-04-20 04:28:48 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:28:48.948864 | orchestrator | 2026-04-20 04:28:48 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:28:48.948970 | orchestrator | 2026-04-20 04:28:48 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:28:51.994394 | orchestrator | 2026-04-20 04:28:51 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:28:51.996131 | orchestrator | 2026-04-20 04:28:51 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:28:51.996258 | orchestrator | 2026-04-20 04:28:51 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 04:28:55.041662 | orchestrator | 2026-04-20 04:28:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:28:55.043937 | orchestrator | 2026-04-20 04:28:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:28:55.043987 | orchestrator | 2026-04-20 04:28:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:28:58.089451 | orchestrator | 2026-04-20 04:28:58 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:28:58.090889 | orchestrator | 2026-04-20 04:28:58 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:28:58.090960 | orchestrator | 2026-04-20 04:28:58 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:01.128366 | orchestrator | 2026-04-20 04:29:01 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:01.130496 | orchestrator | 2026-04-20 04:29:01 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:01.130571 | orchestrator | 2026-04-20 04:29:01 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:04.179709 | orchestrator | 2026-04-20 04:29:04 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:04.181763 | orchestrator | 2026-04-20 04:29:04 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:04.181843 | orchestrator | 2026-04-20 04:29:04 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:07.227796 | orchestrator | 2026-04-20 04:29:07 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:07.230283 | orchestrator | 2026-04-20 04:29:07 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:07.230352 | orchestrator | 2026-04-20 04:29:07 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:10.274642 | orchestrator | 2026-04-20 
04:29:10 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:10.275956 | orchestrator | 2026-04-20 04:29:10 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:10.276142 | orchestrator | 2026-04-20 04:29:10 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:13.318314 | orchestrator | 2026-04-20 04:29:13 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:13.319846 | orchestrator | 2026-04-20 04:29:13 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:13.319951 | orchestrator | 2026-04-20 04:29:13 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:16.355681 | orchestrator | 2026-04-20 04:29:16 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:16.356308 | orchestrator | 2026-04-20 04:29:16 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:16.356339 | orchestrator | 2026-04-20 04:29:16 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:19.394007 | orchestrator | 2026-04-20 04:29:19 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:19.396175 | orchestrator | 2026-04-20 04:29:19 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:19.396227 | orchestrator | 2026-04-20 04:29:19 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:22.443765 | orchestrator | 2026-04-20 04:29:22 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:22.445816 | orchestrator | 2026-04-20 04:29:22 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:22.445864 | orchestrator | 2026-04-20 04:29:22 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:25.484804 | orchestrator | 2026-04-20 04:29:25 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state 
STARTED 2026-04-20 04:29:25.487432 | orchestrator | 2026-04-20 04:29:25 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:25.487514 | orchestrator | 2026-04-20 04:29:25 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:28.537022 | orchestrator | 2026-04-20 04:29:28 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:28.539131 | orchestrator | 2026-04-20 04:29:28 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:28.539230 | orchestrator | 2026-04-20 04:29:28 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:31.588670 | orchestrator | 2026-04-20 04:29:31 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:31.591040 | orchestrator | 2026-04-20 04:29:31 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:31.591118 | orchestrator | 2026-04-20 04:29:31 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:34.637320 | orchestrator | 2026-04-20 04:29:34 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:34.639551 | orchestrator | 2026-04-20 04:29:34 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:34.639605 | orchestrator | 2026-04-20 04:29:34 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:37.683983 | orchestrator | 2026-04-20 04:29:37 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:37.686086 | orchestrator | 2026-04-20 04:29:37 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:37.686530 | orchestrator | 2026-04-20 04:29:37 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:40.736645 | orchestrator | 2026-04-20 04:29:40 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:40.738856 | orchestrator | 2026-04-20 04:29:40 | INFO  
| Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:40.738895 | orchestrator | 2026-04-20 04:29:40 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:43.786492 | orchestrator | 2026-04-20 04:29:43 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:43.789152 | orchestrator | 2026-04-20 04:29:43 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:43.789332 | orchestrator | 2026-04-20 04:29:43 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:46.838715 | orchestrator | 2026-04-20 04:29:46 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:46.840344 | orchestrator | 2026-04-20 04:29:46 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:46.840422 | orchestrator | 2026-04-20 04:29:46 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:49.893402 | orchestrator | 2026-04-20 04:29:49 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:49.895523 | orchestrator | 2026-04-20 04:29:49 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:49.895615 | orchestrator | 2026-04-20 04:29:49 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:52.937029 | orchestrator | 2026-04-20 04:29:52 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:52.938618 | orchestrator | 2026-04-20 04:29:52 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:52.938664 | orchestrator | 2026-04-20 04:29:52 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:55.978929 | orchestrator | 2026-04-20 04:29:55 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:55.980343 | orchestrator | 2026-04-20 04:29:55 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 
04:29:55.980508 | orchestrator | 2026-04-20 04:29:55 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:29:59.019717 | orchestrator | 2026-04-20 04:29:59 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:29:59.020571 | orchestrator | 2026-04-20 04:29:59 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:29:59.020652 | orchestrator | 2026-04-20 04:29:59 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:30:02.064520 | orchestrator | 2026-04-20 04:30:02 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:30:02.066560 | orchestrator | 2026-04-20 04:30:02 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:30:02.066618 | orchestrator | 2026-04-20 04:30:02 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:30:05.113459 | orchestrator | 2026-04-20 04:30:05 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:30:05.113647 | orchestrator | 2026-04-20 04:30:05 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:30:05.113670 | orchestrator | 2026-04-20 04:30:05 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:30:08.160656 | orchestrator | 2026-04-20 04:30:08 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:30:08.160762 | orchestrator | 2026-04-20 04:30:08 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:30:08.160874 | orchestrator | 2026-04-20 04:30:08 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:30:11.201812 | orchestrator | 2026-04-20 04:30:11 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:30:11.203415 | orchestrator | 2026-04-20 04:30:11 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:30:11.203584 | orchestrator | 2026-04-20 04:30:11 | INFO  | Wait 1 second(s) 
until the next check 2026-04-20 04:30:14.251673 | orchestrator | 2026-04-20 04:30:14 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:30:14.252289 | orchestrator | 2026-04-20 04:30:14 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:30:14.252312 | orchestrator | 2026-04-20 04:30:14 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:30:17.299035 | orchestrator | 2026-04-20 04:30:17 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:30:17.302147 | orchestrator | 2026-04-20 04:30:17 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:30:17.302488 | orchestrator | 2026-04-20 04:30:17 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:30:20.349400 | orchestrator | 2026-04-20 04:30:20 | INFO  | Task 9b2c58ff-ed39-423f-a7ff-d0cb25eaa22e is in state STARTED 2026-04-20 04:30:20.350895 | orchestrator | 2026-04-20 04:30:20 | INFO  | Task 16b76ba9-8ffd-4b1a-bd82-45b90ed48c9c is in state STARTED 2026-04-20 04:30:20.350942 | orchestrator | 2026-04-20 04:30:20 | INFO  | Wait 1 second(s) until the next check 2026-04-20 04:30:20.866689 | RUN END RESULT_TIMED_OUT: [untrusted : github.com/osism/testbed/playbooks/deploy.yml@main] 2026-04-20 04:30:20.871444 | POST-RUN START: [untrusted : github.com/osism/testbed/playbooks/post.yml@main] 2026-04-20 04:30:21.654772 | 2026-04-20 04:30:21.655001 | PLAY [Post output play] 2026-04-20 04:30:21.675125 | 2026-04-20 04:30:21.675429 | LOOP [stage-output : Register sources] 2026-04-20 04:30:21.752504 | 2026-04-20 04:30:21.753061 | TASK [stage-output : Check sudo] 2026-04-20 04:30:22.642376 | orchestrator | sudo: a password is required 2026-04-20 04:30:22.972092 | orchestrator | ok: Runtime: 0:00:00.017607 2026-04-20 04:30:22.987713 | 2026-04-20 04:30:22.987893 | LOOP [stage-output : Set source and destination for files and folders] 2026-04-20 04:30:23.028967 | 2026-04-20 
04:30:23.029265 | TASK [stage-output : Build a list of source, dest dictionaries]
2026-04-20 04:30:23.108971 | orchestrator | ok
2026-04-20 04:30:23.118890 |
2026-04-20 04:30:23.119028 | LOOP [stage-output : Ensure target folders exist]
2026-04-20 04:30:23.597251 | orchestrator | ok: "docs"
2026-04-20 04:30:23.597643 |
2026-04-20 04:30:23.858806 | orchestrator | ok: "artifacts"
2026-04-20 04:30:24.168832 | orchestrator | ok: "logs"
2026-04-20 04:30:24.185773 |
2026-04-20 04:30:24.185933 | LOOP [stage-output : Copy files and folders to staging folder]
2026-04-20 04:30:24.219834 |
2026-04-20 04:30:24.220078 | TASK [stage-output : Make all log files readable]
2026-04-20 04:30:24.543148 | orchestrator | ok
2026-04-20 04:30:24.552297 |
2026-04-20 04:30:24.552431 | TASK [stage-output : Rename log files that match extensions_to_txt]
2026-04-20 04:30:24.587073 | orchestrator | skipping: Conditional result was False
2026-04-20 04:30:24.603712 |
2026-04-20 04:30:24.603892 | TASK [stage-output : Discover log files for compression]
2026-04-20 04:30:24.628597 | orchestrator | skipping: Conditional result was False
2026-04-20 04:30:24.639749 |
2026-04-20 04:30:24.639904 | LOOP [stage-output : Archive everything from logs]
2026-04-20 04:30:24.686546 |
2026-04-20 04:30:24.686755 | PLAY [Post cleanup play]
2026-04-20 04:30:24.696150 |
2026-04-20 04:30:24.696266 | TASK [Set cloud fact (Zuul deployment)]
2026-04-20 04:30:24.763110 | orchestrator | ok
2026-04-20 04:30:24.774645 |
2026-04-20 04:30:24.774777 | TASK [Set cloud fact (local deployment)]
2026-04-20 04:30:24.809242 | orchestrator | skipping: Conditional result was False
2026-04-20 04:30:24.828640 |
2026-04-20 04:30:24.828841 | TASK [Clean the cloud environment]
2026-04-20 04:30:26.583079 | orchestrator | 2026-04-20 04:30:26 - clean up servers
2026-04-20 04:30:27.479513 | orchestrator | 2026-04-20 04:30:27 - testbed-manager
2026-04-20 04:30:27.572667 | orchestrator | 2026-04-20 04:30:27 - testbed-node-4
2026-04-20
04:30:27.664891 | orchestrator | 2026-04-20 04:30:27 - testbed-node-0 2026-04-20 04:30:27.767750 | orchestrator | 2026-04-20 04:30:27 - testbed-node-3 2026-04-20 04:30:27.867555 | orchestrator | 2026-04-20 04:30:27 - testbed-node-5 2026-04-20 04:30:27.958962 | orchestrator | 2026-04-20 04:30:27 - testbed-node-1 2026-04-20 04:30:28.044341 | orchestrator | 2026-04-20 04:30:28 - testbed-node-2 2026-04-20 04:30:28.144353 | orchestrator | 2026-04-20 04:30:28 - clean up keypairs 2026-04-20 04:30:28.164947 | orchestrator | 2026-04-20 04:30:28 - testbed 2026-04-20 04:30:28.189514 | orchestrator | 2026-04-20 04:30:28 - wait for servers to be gone 2026-04-20 04:30:41.265301 | orchestrator | 2026-04-20 04:30:41 - clean up ports 2026-04-20 04:30:41.475268 | orchestrator | 2026-04-20 04:30:41 - 1c06c1ee-93ce-4ff1-82ee-3291ad413150 2026-04-20 04:30:41.729963 | orchestrator | 2026-04-20 04:30:41 - 81ada3bd-3822-4298-9fc6-0d2d7af84663 2026-04-20 04:30:41.988227 | orchestrator | 2026-04-20 04:30:41 - b0707f86-5249-430c-90d5-219ddad6e3af 2026-04-20 04:30:42.250801 | orchestrator | 2026-04-20 04:30:42 - c0d7958f-2972-4e3f-8f45-e999644762f9 2026-04-20 04:30:42.496807 | orchestrator | 2026-04-20 04:30:42 - daf2c5d3-02cd-409f-9ff4-c6c9c274c883 2026-04-20 04:30:43.513073 | orchestrator | 2026-04-20 04:30:43 - e7df36ab-7f7c-40e1-853b-d0174b1437d8 2026-04-20 04:30:43.738703 | orchestrator | 2026-04-20 04:30:43 - f7434a63-67fb-4940-b5f7-5a049f724b58 2026-04-20 04:30:44.150853 | orchestrator | 2026-04-20 04:30:44 - clean up volumes 2026-04-20 04:30:44.307245 | orchestrator | 2026-04-20 04:30:44 - testbed-volume-5-node-base 2026-04-20 04:30:44.346865 | orchestrator | 2026-04-20 04:30:44 - testbed-volume-2-node-base 2026-04-20 04:30:44.394769 | orchestrator | 2026-04-20 04:30:44 - testbed-volume-1-node-base 2026-04-20 04:30:44.438266 | orchestrator | 2026-04-20 04:30:44 - testbed-volume-3-node-base 2026-04-20 04:30:44.482984 | orchestrator | 2026-04-20 04:30:44 - testbed-volume-manager-base 
2026-04-20 04:30:44.535511 | orchestrator | 2026-04-20 04:30:44 - testbed-volume-0-node-base
2026-04-20 04:30:44.581409 | orchestrator | 2026-04-20 04:30:44 - testbed-volume-4-node-base
2026-04-20 04:30:44.639443 | orchestrator | 2026-04-20 04:30:44 - testbed-volume-5-node-5
2026-04-20 04:30:44.686573 | orchestrator | 2026-04-20 04:30:44 - testbed-volume-8-node-5
2026-04-20 04:30:44.732653 | orchestrator | 2026-04-20 04:30:44 - testbed-volume-4-node-4
2026-04-20 04:30:44.778646 | orchestrator | 2026-04-20 04:30:44 - testbed-volume-0-node-3
2026-04-20 04:30:44.826846 | orchestrator | 2026-04-20 04:30:44 - testbed-volume-7-node-4
2026-04-20 04:30:44.869269 | orchestrator | 2026-04-20 04:30:44 - testbed-volume-2-node-5
2026-04-20 04:30:44.919084 | orchestrator | 2026-04-20 04:30:44 - testbed-volume-3-node-3
2026-04-20 04:30:44.969900 | orchestrator | 2026-04-20 04:30:44 - testbed-volume-6-node-3
2026-04-20 04:30:45.017062 | orchestrator | 2026-04-20 04:30:45 - testbed-volume-1-node-4
2026-04-20 04:30:45.062379 | orchestrator | 2026-04-20 04:30:45 - disconnect routers
2026-04-20 04:30:45.161241 | orchestrator | 2026-04-20 04:30:45 - testbed
2026-04-20 04:30:46.177345 | orchestrator | 2026-04-20 04:30:46 - clean up subnets
2026-04-20 04:30:46.251586 | orchestrator | 2026-04-20 04:30:46 - subnet-testbed-management
2026-04-20 04:30:46.494920 | orchestrator | 2026-04-20 04:30:46 - clean up networks
2026-04-20 04:30:46.660626 | orchestrator | 2026-04-20 04:30:46 - net-testbed-management
2026-04-20 04:30:46.999243 | orchestrator | 2026-04-20 04:30:46 - clean up security groups
2026-04-20 04:30:47.046744 | orchestrator | 2026-04-20 04:30:47 - testbed-node
2026-04-20 04:30:47.181317 | orchestrator | 2026-04-20 04:30:47 - testbed-management
2026-04-20 04:30:47.302135 | orchestrator | 2026-04-20 04:30:47 - clean up floating ips
2026-04-20 04:30:47.335444 | orchestrator | 2026-04-20 04:30:47 - 81.163.193.117
2026-04-20 04:30:47.780847 | orchestrator | 2026-04-20 04:30:47 - clean up routers
2026-04-20 04:30:47.914292 | orchestrator | 2026-04-20 04:30:47 - testbed
2026-04-20 04:30:49.885696 | orchestrator | ok: Runtime: 0:00:24.382430
2026-04-20 04:30:49.889977 |
2026-04-20 04:30:49.890156 | PLAY RECAP
2026-04-20 04:30:49.890295 | orchestrator | ok: 6 changed: 2 unreachable: 0 failed: 0 skipped: 7 rescued: 0 ignored: 0
2026-04-20 04:30:49.890362 |
2026-04-20 04:30:50.045196 | POST-RUN END RESULT_NORMAL: [untrusted : github.com/osism/testbed/playbooks/post.yml@main]
2026-04-20 04:30:50.051649 | POST-RUN START: [untrusted : github.com/osism/testbed/playbooks/cleanup.yml@main]
2026-04-20 04:30:50.799226 |
2026-04-20 04:30:50.799395 | PLAY [Cleanup play]
2026-04-20 04:30:50.815781 |
2026-04-20 04:30:50.815925 | TASK [Set cloud fact (Zuul deployment)]
2026-04-20 04:30:50.875662 | orchestrator | ok
2026-04-20 04:30:50.886116 |
2026-04-20 04:30:50.886273 | TASK [Set cloud fact (local deployment)]
2026-04-20 04:30:50.921544 | orchestrator | skipping: Conditional result was False
2026-04-20 04:30:50.932663 |
2026-04-20 04:30:50.932773 | TASK [Clean the cloud environment]
2026-04-20 04:30:52.132482 | orchestrator | 2026-04-20 04:30:52 - clean up servers
2026-04-20 04:30:52.772487 | orchestrator | 2026-04-20 04:30:52 - clean up keypairs
2026-04-20 04:30:52.788813 | orchestrator | 2026-04-20 04:30:52 - wait for servers to be gone
2026-04-20 04:30:52.831074 | orchestrator | 2026-04-20 04:30:52 - clean up ports
2026-04-20 04:30:52.916861 | orchestrator | 2026-04-20 04:30:52 - clean up volumes
2026-04-20 04:30:52.996940 | orchestrator | 2026-04-20 04:30:52 - disconnect routers
2026-04-20 04:30:53.038110 | orchestrator | 2026-04-20 04:30:53 - clean up subnets
2026-04-20 04:30:53.064021 | orchestrator | 2026-04-20 04:30:53 - clean up networks
2026-04-20 04:30:53.293559 | orchestrator | 2026-04-20 04:30:53 - clean up security groups
2026-04-20 04:30:53.327215 | orchestrator | 2026-04-20 04:30:53 - clean up floating ips
2026-04-20 04:30:53.357778 | orchestrator | 2026-04-20 04:30:53 - clean up routers
2026-04-20 04:30:53.968213 | orchestrator | ok: Runtime: 0:00:01.656809
2026-04-20 04:30:53.972427 |
2026-04-20 04:30:53.972702 | PLAY RECAP
2026-04-20 04:30:53.972846 | orchestrator | ok: 2 changed: 1 unreachable: 0 failed: 0 skipped: 1 rescued: 0 ignored: 0
2026-04-20 04:30:53.972908 |
2026-04-20 04:30:54.100869 | POST-RUN END RESULT_NORMAL: [untrusted : github.com/osism/testbed/playbooks/cleanup.yml@main]
2026-04-20 04:30:54.102062 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post-fetch.yaml@main]
2026-04-20 04:30:54.851397 |
2026-04-20 04:30:54.851581 | PLAY [Base post-fetch]
2026-04-20 04:30:54.867319 |
2026-04-20 04:30:54.867450 | TASK [fetch-output : Set log path for multiple nodes]
2026-04-20 04:30:54.923618 | orchestrator | skipping: Conditional result was False
2026-04-20 04:30:54.937252 |
2026-04-20 04:30:54.937532 | TASK [fetch-output : Set log path for single node]
2026-04-20 04:30:54.987689 | orchestrator | ok
2026-04-20 04:30:54.997193 |
2026-04-20 04:30:54.997352 | LOOP [fetch-output : Ensure local output dirs]
2026-04-20 04:30:55.514039 | orchestrator -> localhost | ok: "/var/lib/zuul/builds/bf20320bb57b40e38431504705879859/work/logs"
2026-04-20 04:30:55.811313 | orchestrator -> localhost | changed: "/var/lib/zuul/builds/bf20320bb57b40e38431504705879859/work/artifacts"
2026-04-20 04:30:56.099049 | orchestrator -> localhost | changed: "/var/lib/zuul/builds/bf20320bb57b40e38431504705879859/work/docs"
2026-04-20 04:30:56.128381 |
2026-04-20 04:30:56.128782 | LOOP [fetch-output : Collect logs, artifacts and docs]
2026-04-20 04:30:57.092719 | orchestrator | changed: .d..t...... ./
2026-04-20 04:30:57.093118 | orchestrator | changed: All items complete
2026-04-20 04:30:57.093194 |
2026-04-20 04:30:57.839883 | orchestrator | changed: .d..t...... ./
2026-04-20 04:30:58.589727 | orchestrator | changed: .d..t...... ./
2026-04-20 04:30:58.618733 |
2026-04-20 04:30:58.618909 | LOOP [merge-output-to-logs : Move artifacts and docs to logs dir]
2026-04-20 04:30:58.654534 | orchestrator | skipping: Conditional result was False
2026-04-20 04:30:58.657844 | orchestrator | skipping: Conditional result was False
2026-04-20 04:30:58.678713 |
2026-04-20 04:30:58.678892 | PLAY RECAP
2026-04-20 04:30:58.678988 | orchestrator | ok: 3 changed: 2 unreachable: 0 failed: 0 skipped: 2 rescued: 0 ignored: 0
2026-04-20 04:30:58.679030 |
2026-04-20 04:30:58.814726 | POST-RUN END RESULT_NORMAL: [trusted : github.com/osism/zuul-config/playbooks/base/post-fetch.yaml@main]
2026-04-20 04:30:58.817270 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post.yaml@main]
2026-04-20 04:30:59.575156 |
2026-04-20 04:30:59.575322 | PLAY [Base post]
2026-04-20 04:30:59.590042 |
2026-04-20 04:30:59.590175 | TASK [remove-build-sshkey : Remove the build SSH key from all nodes]
2026-04-20 04:31:00.717425 | orchestrator | changed
2026-04-20 04:31:00.727170 |
2026-04-20 04:31:00.727302 | PLAY RECAP
2026-04-20 04:31:00.727381 | orchestrator | ok: 1 changed: 1 unreachable: 0 failed: 0 skipped: 0 rescued: 0 ignored: 0
2026-04-20 04:31:00.727456 |
2026-04-20 04:31:00.851894 | POST-RUN END RESULT_NORMAL: [trusted : github.com/osism/zuul-config/playbooks/base/post.yaml@main]
2026-04-20 04:31:00.852976 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post-logs.yaml@main]
2026-04-20 04:31:01.673358 |
2026-04-20 04:31:01.673618 | PLAY [Base post-logs]
2026-04-20 04:31:01.684777 |
2026-04-20 04:31:01.684923 | TASK [generate-zuul-manifest : Generate Zuul manifest]
2026-04-20 04:31:02.168798 | localhost | changed
2026-04-20 04:31:02.178941 |
2026-04-20 04:31:02.179094 | TASK [generate-zuul-manifest : Return Zuul manifest URL to Zuul]
2026-04-20 04:31:02.206258 | localhost | ok
2026-04-20 04:31:02.209302 |
2026-04-20 04:31:02.209406 | TASK [Set zuul-log-path fact]
2026-04-20 04:31:02.224263 | localhost | ok
2026-04-20 04:31:02.233887 |
2026-04-20 04:31:02.234005 | TASK [set-zuul-log-path-fact : Set log path for a build]
2026-04-20 04:31:02.258967 | localhost | ok
2026-04-20 04:31:02.262386 |
2026-04-20 04:31:02.262501 | TASK [upload-logs : Create log directories]
2026-04-20 04:31:02.805732 | localhost | changed
2026-04-20 04:31:02.811257 |
2026-04-20 04:31:02.811425 | TASK [upload-logs : Ensure logs are readable before uploading]
2026-04-20 04:31:03.324034 | localhost -> localhost | ok: Runtime: 0:00:00.006901
2026-04-20 04:31:03.333825 |
2026-04-20 04:31:03.334061 | TASK [upload-logs : Upload logs to log server]
2026-04-20 04:31:03.921970 | localhost | Output suppressed because no_log was given
2026-04-20 04:31:03.925994 |
2026-04-20 04:31:03.926189 | LOOP [upload-logs : Compress console log and json output]
2026-04-20 04:31:03.985235 | localhost | skipping: Conditional result was False
2026-04-20 04:31:03.991212 | localhost | skipping: Conditional result was False
2026-04-20 04:31:03.998830 |
2026-04-20 04:31:03.999113 | LOOP [upload-logs : Upload compressed console log and json output]
2026-04-20 04:31:04.050297 | localhost | skipping: Conditional result was False
2026-04-20 04:31:04.051041 |
2026-04-20 04:31:04.054722 | localhost | skipping: Conditional result was False
2026-04-20 04:31:04.060176 |
2026-04-20 04:31:04.060308 | LOOP [upload-logs : Upload console log and json output]